CHECK: Is CUDA the right version (10)?
Not applying augmentation to RGBD data
Loading a pascal format RGBD dataset
WARN: Loading imagenet weights
Creating model, this may take a second...
Building ResNet backbone using defined input shape of Tensor("input_1:0", shape=(?, ?, ?, 4), dtype=float32)
Loading weights into RGB model
Loading weights into depth model
tracking anchors
tracking anchors
tracking anchors
tracking anchors
tracking anchors
Model: "retinanet"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_1 (InputLayer)            (None, None, None, 4 0
lambda_1 (Lambda)               (None, None, None, 3 0           input_1[0][0]
padding_conv1_rgb (ZeroPadding2 (None, None, None, 3 0           lambda_1[0][0]
conv1_rgb (Conv2D)              (None, None, None, 6 9408        padding_conv1_rgb[0][0]
bn_conv1_rgb (BatchNormalizatio (None, None, None, 6 256         conv1_rgb[0][0]
conv1_relu_rgb (Activation)     (None, None, None, 6 0           bn_conv1_rgb[0][0]
pool1_rgb (MaxPooling2D)        (None, None, None, 6 0           conv1_relu_rgb[0][0]
res2a_branch2a_rgb (Conv2D)     (None, None, None, 6 4096        pool1_rgb[0][0]
bn2a_branch2a_rgb (BatchNormali (None, None, None, 6 256         res2a_branch2a_rgb[0][0]
res2a_branch2a_relu_rgb (Activa (None, None, None, 6 0           bn2a_branch2a_rgb[0][0]
padding2a_branch2b_rgb (ZeroPad (None, None, None, 6 0           res2a_branch2a_relu_rgb[0][0]
res2a_branch2b_rgb (Conv2D)     (None, None, None, 6 36864       padding2a_branch2b_rgb[0][0]
bn2a_branch2b_rgb (BatchNormali (None, None, None, 6 256         res2a_branch2b_rgb[0][0]
res2a_branch2b_relu_rgb (Activa (None, None, None, 6 0           bn2a_branch2b_rgb[0][0]
res2a_branch2c_rgb (Conv2D)     (None, None, None, 2 16384       res2a_branch2b_relu_rgb[0][0]
res2a_branch1_rgb (Conv2D)      (None, None, None, 2 16384       pool1_rgb[0][0]
bn2a_branch2c_rgb (BatchNormali (None, None, None, 2 1024        res2a_branch2c_rgb[0][0]
bn2a_branch1_rgb (BatchNormaliz (None, None, None, 2 1024        res2a_branch1_rgb[0][0]
res2a_rgb (Add)                 (None, None, None, 2 0           bn2a_branch2c_rgb[0][0]
                                                                 bn2a_branch1_rgb[0][0]
res2a_relu_rgb (Activation)     (None, None, None, 2 0           res2a_rgb[0][0]
res2b_branch2a_rgb (Conv2D)     (None, None, None, 6 16384       res2a_relu_rgb[0][0]
bn2b_branch2a_rgb (BatchNormali (None, None, None, 6 256         res2b_branch2a_rgb[0][0]
res2b_branch2a_relu_rgb (Activa (None, None, None, 6 0           bn2b_branch2a_rgb[0][0]
padding2b_branch2b_rgb (ZeroPad (None, None, None, 6 0           res2b_branch2a_relu_rgb[0][0]
res2b_branch2b_rgb (Conv2D)     (None, None, None, 6 36864       padding2b_branch2b_rgb[0][0]
bn2b_branch2b_rgb (BatchNormali (None, None, None, 6 256         res2b_branch2b_rgb[0][0]
res2b_branch2b_relu_rgb (Activa (None, None, None, 6 0           bn2b_branch2b_rgb[0][0]
res2b_branch2c_rgb (Conv2D)     (None, None, None, 2 16384       res2b_branch2b_relu_rgb[0][0]
bn2b_branch2c_rgb (BatchNormali (None, None, None, 2 1024        res2b_branch2c_rgb[0][0]
res2b_rgb (Add)                 (None, None, None, 2 0           bn2b_branch2c_rgb[0][0]
                                                                 res2a_relu_rgb[0][0]
res2b_relu_rgb (Activation)     (None, None, None, 2 0           res2b_rgb[0][0]
res2c_branch2a_rgb (Conv2D)     (None, None, None, 6 16384       res2b_relu_rgb[0][0]
bn2c_branch2a_rgb (BatchNormali (None, None, None, 6 256         res2c_branch2a_rgb[0][0]
res2c_branch2a_relu_rgb (Activa (None, None, None, 6 0           bn2c_branch2a_rgb[0][0]
padding2c_branch2b_rgb (ZeroPad (None, None, None, 6 0           res2c_branch2a_relu_rgb[0][0]
res2c_branch2b_rgb (Conv2D)     (None, None, None, 6 36864       padding2c_branch2b_rgb[0][0]
bn2c_branch2b_rgb (BatchNormali (None, None, None, 6 256         res2c_branch2b_rgb[0][0]
res2c_branch2b_relu_rgb (Activa (None, None, None, 6 0           bn2c_branch2b_rgb[0][0]
res2c_branch2c_rgb (Conv2D)     (None, None, None, 2 16384       res2c_branch2b_relu_rgb[0][0]
bn2c_branch2c_rgb (BatchNormali (None, None, None, 2 1024        res2c_branch2c_rgb[0][0]
res2c_rgb (Add)                 (None, None, None, 2 0           bn2c_branch2c_rgb[0][0]
                                                                 res2b_relu_rgb[0][0]
res2c_relu_rgb (Activation)     (None, None, None, 2 0           res2c_rgb[0][0]
res3a_branch2a_rgb (Conv2D)     (None, None, None, 1 32768       res2c_relu_rgb[0][0]
bn3a_branch2a_rgb (BatchNormali (None, None, None, 1 512         res3a_branch2a_rgb[0][0]
res3a_branch2a_relu_rgb (Activa (None, None, None, 1 0           bn3a_branch2a_rgb[0][0]
padding3a_branch2b_rgb (ZeroPad (None, None, None, 1 0           res3a_branch2a_relu_rgb[0][0]
res3a_branch2b_rgb (Conv2D)     (None, None, None, 1 147456      padding3a_branch2b_rgb[0][0]
bn3a_branch2b_rgb (BatchNormali (None, None, None, 1 512         res3a_branch2b_rgb[0][0]
res3a_branch2b_relu_rgb (Activa (None, None, None, 1 0           bn3a_branch2b_rgb[0][0]
res3a_branch2c_rgb (Conv2D)     (None, None, None, 5 65536       res3a_branch2b_relu_rgb[0][0]
res3a_branch1_rgb (Conv2D)      (None, None, None, 5 131072      res2c_relu_rgb[0][0]
bn3a_branch2c_rgb (BatchNormali (None, None, None, 5 2048        res3a_branch2c_rgb[0][0]
bn3a_branch1_rgb (BatchNormaliz (None, None, None, 5 2048        res3a_branch1_rgb[0][0]
res3a_rgb (Add)                 (None, None, None, 5 0           bn3a_branch2c_rgb[0][0]
                                                                 bn3a_branch1_rgb[0][0]
res3a_relu_rgb (Activation)     (None, None, None, 5 0           res3a_rgb[0][0]
res3b1_branch2a_rgb (Conv2D)    (None, None, None, 1 65536       res3a_relu_rgb[0][0]
bn3b1_branch2a_rgb (BatchNormal (None, None, None, 1 512         res3b1_branch2a_rgb[0][0]
res3b1_branch2a_relu_rgb (Activ (None, None, None, 1 0           bn3b1_branch2a_rgb[0][0]
padding3b1_branch2b_rgb (ZeroPa (None, None, None, 1 0           res3b1_branch2a_relu_rgb[0][0]
res3b1_branch2b_rgb (Conv2D)    (None, None, None, 1 147456      padding3b1_branch2b_rgb[0][0]
bn3b1_branch2b_rgb (BatchNormal (None, None, None, 1 512         res3b1_branch2b_rgb[0][0]
res3b1_branch2b_relu_rgb (Activ (None, None, None, 1 0           bn3b1_branch2b_rgb[0][0]
res3b1_branch2c_rgb (Conv2D)    (None, None, None, 5 65536       res3b1_branch2b_relu_rgb[0][0]
bn3b1_branch2c_rgb (BatchNormal (None, None, None, 5 2048        res3b1_branch2c_rgb[0][0]
res3b1_rgb (Add)                (None, None, None, 5 0           bn3b1_branch2c_rgb[0][0]
                                                                 res3a_relu_rgb[0][0]
res3b1_relu_rgb (Activation)    (None, None, None, 5 0           res3b1_rgb[0][0]
res3b2_branch2a_rgb (Conv2D)    (None, None, None, 1 65536       res3b1_relu_rgb[0][0]
bn3b2_branch2a_rgb (BatchNormal (None, None, None, 1 512         res3b2_branch2a_rgb[0][0]
res3b2_branch2a_relu_rgb (Activ (None, None, None, 1 0           bn3b2_branch2a_rgb[0][0]
padding3b2_branch2b_rgb (ZeroPa (None, None, None, 1 0           res3b2_branch2a_relu_rgb[0][0]
res3b2_branch2b_rgb (Conv2D)    (None, None, None, 1 147456      padding3b2_branch2b_rgb[0][0]
bn3b2_branch2b_rgb (BatchNormal (None, None, None, 1 512         res3b2_branch2b_rgb[0][0]
res3b2_branch2b_relu_rgb (Activ (None, None, None, 1 0           bn3b2_branch2b_rgb[0][0]
res3b2_branch2c_rgb (Conv2D)    (None, None, None, 5 65536       res3b2_branch2b_relu_rgb[0][0]
bn3b2_branch2c_rgb (BatchNormal (None, None, None, 5 2048        res3b2_branch2c_rgb[0][0]
res3b2_rgb (Add)                (None, None, None, 5 0           bn3b2_branch2c_rgb[0][0]
                                                                 res3b1_relu_rgb[0][0]
res3b2_relu_rgb (Activation)    (None, None, None, 5 0           res3b2_rgb[0][0]
res3b3_branch2a_rgb (Conv2D)    (None, None, None, 1 65536       res3b2_relu_rgb[0][0]
bn3b3_branch2a_rgb (BatchNormal (None, None, None, 1 512         res3b3_branch2a_rgb[0][0]
res3b3_branch2a_relu_rgb (Activ (None, None, None, 1 0           bn3b3_branch2a_rgb[0][0]
padding3b3_branch2b_rgb (ZeroPa (None, None, None, 1 0           res3b3_branch2a_relu_rgb[0][0]
res3b3_branch2b_rgb (Conv2D)    (None, None, None, 1 147456      padding3b3_branch2b_rgb[0][0]
bn3b3_branch2b_rgb (BatchNormal (None, None, None, 1 512         res3b3_branch2b_rgb[0][0]
res3b3_branch2b_relu_rgb (Activ (None, None, None, 1 0           bn3b3_branch2b_rgb[0][0]
res3b3_branch2c_rgb (Conv2D)    (None, None, None, 5 65536       res3b3_branch2b_relu_rgb[0][0]
bn3b3_branch2c_rgb (BatchNormal (None, None, None, 5 2048        res3b3_branch2c_rgb[0][0]
res3b3_rgb (Add)                (None, None, None, 5 0           bn3b3_branch2c_rgb[0][0]
                                                                 res3b2_relu_rgb[0][0]
res3b3_relu_rgb (Activation)    (None, None, None, 5 0           res3b3_rgb[0][0]
res4a_branch2a_rgb (Conv2D)     (None, None, None, 2 131072      res3b3_relu_rgb[0][0]
bn4a_branch2a_rgb (BatchNormali (None, None, None, 2 1024        res4a_branch2a_rgb[0][0]
res4a_branch2a_relu_rgb (Activa (None, None, None, 2 0           bn4a_branch2a_rgb[0][0]
padding4a_branch2b_rgb (ZeroPad (None, None, None, 2 0           res4a_branch2a_relu_rgb[0][0]
res4a_branch2b_rgb (Conv2D)     (None, None, None, 2 589824      padding4a_branch2b_rgb[0][0]
bn4a_branch2b_rgb (BatchNormali (None, None, None, 2 1024        res4a_branch2b_rgb[0][0]
res4a_branch2b_relu_rgb (Activa (None, None, None, 2 0           bn4a_branch2b_rgb[0][0]
res4a_branch2c_rgb (Conv2D)     (None, None, None, 1 262144      res4a_branch2b_relu_rgb[0][0]
res4a_branch1_rgb (Conv2D)      (None, None, None, 1 524288      res3b3_relu_rgb[0][0]
bn4a_branch2c_rgb (BatchNormali (None, None, None, 1 4096        res4a_branch2c_rgb[0][0]
bn4a_branch1_rgb (BatchNormaliz (None, None, None, 1 4096        res4a_branch1_rgb[0][0]
res4a_rgb (Add)                 (None, None, None, 1 0           bn4a_branch2c_rgb[0][0]
                                                                 bn4a_branch1_rgb[0][0]
res4a_relu_rgb (Activation)     (None, None, None, 1 0           res4a_rgb[0][0]
res4b1_branch2a_rgb (Conv2D)    (None, None, None, 2 262144      res4a_relu_rgb[0][0]
bn4b1_branch2a_rgb (BatchNormal (None, None, None, 2 1024        res4b1_branch2a_rgb[0][0]
res4b1_branch2a_relu_rgb (Activ (None, None, None, 2 0           bn4b1_branch2a_rgb[0][0]
padding4b1_branch2b_rgb (ZeroPa (None, None, None, 2 0           res4b1_branch2a_relu_rgb[0][0]
res4b1_branch2b_rgb (Conv2D)    (None, None, None, 2 589824      padding4b1_branch2b_rgb[0][0]
bn4b1_branch2b_rgb (BatchNormal (None, None, None, 2 1024        res4b1_branch2b_rgb[0][0]
res4b1_branch2b_relu_rgb (Activ (None, None, None, 2 0           bn4b1_branch2b_rgb[0][0]
lambda_2 (Lambda)               (None, None, None, 1 0           input_1[0][0]
res4b1_branch2c_rgb (Conv2D)    (None, None, None, 1 262144      res4b1_branch2b_relu_rgb[0][0]
padding_conv1_d (ZeroPadding2D) (None, None, None, 1 0           lambda_2[0][0]
bn4b1_branch2c_rgb (BatchNormal (None, None, None, 1 4096        res4b1_branch2c_rgb[0][0]
conv1_d (Conv2D)                (None, None, None, 6 3136        padding_conv1_d[0][0]
res4b1_rgb (Add)                (None, None, None, 1 0           bn4b1_branch2c_rgb[0][0]
                                                                 res4a_relu_rgb[0][0]
conv1_relu_d (Activation)       (None, None, None, 6 0           conv1_d[0][0]
res4b1_relu_rgb (Activation)    (None, None, None, 1 0           res4b1_rgb[0][0]
pool1_d (MaxPooling2D)          (None, None, None, 6 0           conv1_relu_d[0][0]
res4b2_branch2a_rgb (Conv2D)    (None, None, None, 2 262144      res4b1_relu_rgb[0][0]
res2a_branch2a_d (Conv2D)       (None, None, None, 6 4096        pool1_d[0][0]
bn4b2_branch2a_rgb (BatchNormal (None, None, None, 2 1024        res4b2_branch2a_rgb[0][0]
res2a_branch2a_relu_d (Activati (None, None, None, 6 0           res2a_branch2a_d[0][0]
res4b2_branch2a_relu_rgb (Activ (None, None, None, 2 0           bn4b2_branch2a_rgb[0][0]
padding2a_branch2b_d (ZeroPaddi (None, None, None, 6 0           res2a_branch2a_relu_d[0][0]
padding4b2_branch2b_rgb (ZeroPa (None, None, None, 2 0           res4b2_branch2a_relu_rgb[0][0]
res2a_branch2b_d (Conv2D)       (None, None, None, 6 36864       padding2a_branch2b_d[0][0]
res4b2_branch2b_rgb (Conv2D)    (None, None, None, 2 589824      padding4b2_branch2b_rgb[0][0]
res2a_branch2b_relu_d (Activati (None, None, None, 6 0           res2a_branch2b_d[0][0]
bn4b2_branch2b_rgb (BatchNormal (None, None, None, 2 1024        res4b2_branch2b_rgb[0][0]
res2a_branch2c_d (Conv2D)       (None, None, None, 2 16384       res2a_branch2b_relu_d[0][0]
res2a_branch1_d (Conv2D)        (None, None, None, 2 16384       pool1_d[0][0]
res4b2_branch2b_relu_rgb (Activ (None, None, None, 2 0           bn4b2_branch2b_rgb[0][0]
res2a_d (Add)                   (None, None, None, 2 0           res2a_branch2c_d[0][0]
                                                                 res2a_branch1_d[0][0]
res4b2_branch2c_rgb (Conv2D)    (None, None, None, 1 262144      res4b2_branch2b_relu_rgb[0][0]
res2a_relu_d (Activation)       (None, None, None, 2 0           res2a_d[0][0]
bn4b2_branch2c_rgb (BatchNormal (None, None, None, 1 4096        res4b2_branch2c_rgb[0][0]
res2b_branch2a_d (Conv2D)       (None, None, None, 6 16384       res2a_relu_d[0][0]
res4b2_rgb (Add)                (None, None, None, 1 0           bn4b2_branch2c_rgb[0][0]
                                                                 res4b1_relu_rgb[0][0]
res2b_branch2a_relu_d (Activati (None, None, None, 6 0           res2b_branch2a_d[0][0]
res4b2_relu_rgb (Activation)    (None, None, None, 1 0           res4b2_rgb[0][0]
padding2b_branch2b_d (ZeroPaddi (None, None, None, 6 0           res2b_branch2a_relu_d[0][0]
res4b3_branch2a_rgb (Conv2D)    (None, None, None, 2 262144      res4b2_relu_rgb[0][0]
res2b_branch2b_d (Conv2D)       (None, None, None, 6 36864       padding2b_branch2b_d[0][0]
bn4b3_branch2a_rgb (BatchNormal (None, None, None, 2 1024        res4b3_branch2a_rgb[0][0]
res2b_branch2b_relu_d (Activati (None, None, None, 6 0           res2b_branch2b_d[0][0]
res4b3_branch2a_relu_rgb (Activ (None, None, None, 2 0           bn4b3_branch2a_rgb[0][0]
res2b_branch2c_d (Conv2D)       (None, None, None, 2 16384       res2b_branch2b_relu_d[0][0]
padding4b3_branch2b_rgb (ZeroPa (None, None, None, 2 0           res4b3_branch2a_relu_rgb[0][0]
res2b_d (Add)                   (None, None, None, 2 0           res2b_branch2c_d[0][0]
                                                                 res2a_relu_d[0][0]
res4b3_branch2b_rgb (Conv2D)    (None, None, None, 2 589824      padding4b3_branch2b_rgb[0][0]
res2b_relu_d (Activation)       (None, None, None, 2 0           res2b_d[0][0]
bn4b3_branch2b_rgb (BatchNormal (None, None, None, 2 1024        res4b3_branch2b_rgb[0][0]
res2c_branch2a_d (Conv2D)       (None, None, None, 6 16384       res2b_relu_d[0][0]
res4b3_branch2b_relu_rgb (Activ (None, None, None, 2 0           bn4b3_branch2b_rgb[0][0]
res2c_branch2a_relu_d (Activati (None, None, None, 6 0           res2c_branch2a_d[0][0]
res4b3_branch2c_rgb (Conv2D)    (None, None, None, 1 262144      res4b3_branch2b_relu_rgb[0][0]
padding2c_branch2b_d (ZeroPaddi (None, None, None, 6 0           res2c_branch2a_relu_d[0][0]
bn4b3_branch2c_rgb (BatchNormal (None, None, None, 1 4096        res4b3_branch2c_rgb[0][0]
res2c_branch2b_d (Conv2D)       (None, None, None, 6 36864       padding2c_branch2b_d[0][0]
res4b3_rgb (Add)                (None, None, None, 1 0           bn4b3_branch2c_rgb[0][0]
                                                                 res4b2_relu_rgb[0][0]
res2c_branch2b_relu_d (Activati (None, None, None, 6 0           res2c_branch2b_d[0][0]
res4b3_relu_rgb (Activation)    (None, None, None, 1 0           res4b3_rgb[0][0]
res2c_branch2c_d (Conv2D)       (None, None, None, 2 16384       res2c_branch2b_relu_d[0][0]
res4b4_branch2a_rgb (Conv2D)    (None, None, None, 2 262144      res4b3_relu_rgb[0][0]
res2c_d (Add)                   (None, None, None, 2 0           res2c_branch2c_d[0][0]
                                                                 res2b_relu_d[0][0]
bn4b4_branch2a_rgb (BatchNormal (None, None, None, 2 1024        res4b4_branch2a_rgb[0][0]
res2c_relu_d (Activation)       (None, None, None, 2 0           res2c_d[0][0]
res4b4_branch2a_relu_rgb (Activ (None, None, None, 2 0           bn4b4_branch2a_rgb[0][0]
res3a_branch2a_d (Conv2D)       (None, None, None, 1 32768       res2c_relu_d[0][0]
padding4b4_branch2b_rgb (ZeroPa (None, None, None, 2 0           res4b4_branch2a_relu_rgb[0][0]
res3a_branch2a_relu_d (Activati (None, None, None, 1 0           res3a_branch2a_d[0][0]
res4b4_branch2b_rgb (Conv2D)    (None, None, None, 2 589824      padding4b4_branch2b_rgb[0][0]
padding3a_branch2b_d (ZeroPaddi (None, None, None, 1 0           res3a_branch2a_relu_d[0][0]
bn4b4_branch2b_rgb (BatchNormal (None, None, None, 2 1024        res4b4_branch2b_rgb[0][0]
res3a_branch2b_d (Conv2D)       (None, None, None, 1 147456      padding3a_branch2b_d[0][0]
res4b4_branch2b_relu_rgb (Activ (None, None, None, 2 0           bn4b4_branch2b_rgb[0][0]
res3a_branch2b_relu_d (Activati (None, None, None, 1 0           res3a_branch2b_d[0][0]
res4b4_branch2c_rgb (Conv2D)    (None, None, None, 1 262144      res4b4_branch2b_relu_rgb[0][0]
res3a_branch2c_d (Conv2D)       (None, None, None, 5 65536       res3a_branch2b_relu_d[0][0]
res3a_branch1_d (Conv2D)        (None, None, None, 5 131072      res2c_relu_d[0][0]
bn4b4_branch2c_rgb (BatchNormal (None, None, None, 1 4096        res4b4_branch2c_rgb[0][0]
res3a_d (Add)                   (None, None, None, 5 0           res3a_branch2c_d[0][0]
                                                                 res3a_branch1_d[0][0]
res4b4_rgb (Add)                (None, None, None, 1 0           bn4b4_branch2c_rgb[0][0]
                                                                 res4b3_relu_rgb[0][0]
res3a_relu_d (Activation)       (None, None, None, 5 0           res3a_d[0][0]
res4b4_relu_rgb (Activation)    (None, None, None, 1 0           res4b4_rgb[0][0]
res3b1_branch2a_d (Conv2D)      (None, None, None, 1 65536       res3a_relu_d[0][0]
res4b5_branch2a_rgb (Conv2D)    (None, None, None, 2 262144      res4b4_relu_rgb[0][0]
res3b1_branch2a_relu_d (Activat (None, None, None, 1 0           res3b1_branch2a_d[0][0]
bn4b5_branch2a_rgb (BatchNormal (None, None, None, 2 1024        res4b5_branch2a_rgb[0][0]
padding3b1_branch2b_d (ZeroPadd (None, None, None, 1 0           res3b1_branch2a_relu_d[0][0]
res4b5_branch2a_relu_rgb (Activ (None, None, None, 2 0           bn4b5_branch2a_rgb[0][0]
res3b1_branch2b_d (Conv2D)      (None, None, None, 1 147456      padding3b1_branch2b_d[0][0]
padding4b5_branch2b_rgb (ZeroPa (None, None, None, 2 0           res4b5_branch2a_relu_rgb[0][0]
res3b1_branch2b_relu_d (Activat (None, None, None, 1 0           res3b1_branch2b_d[0][0]
res4b5_branch2b_rgb (Conv2D)    (None, None, None, 2 589824      padding4b5_branch2b_rgb[0][0]
res3b1_branch2c_d (Conv2D)      (None, None, None, 5 65536       res3b1_branch2b_relu_d[0][0]
bn4b5_branch2b_rgb (BatchNormal (None, None, None, 2 1024        res4b5_branch2b_rgb[0][0]
res3b1_d (Add)                  (None, None, None, 5 0           res3b1_branch2c_d[0][0]
                                                                 res3a_relu_d[0][0]
res4b5_branch2b_relu_rgb (Activ (None, None, None, 2 0           bn4b5_branch2b_rgb[0][0]
res3b1_relu_d (Activation)      (None, None, None, 5 0           res3b1_d[0][0]
res4b5_branch2c_rgb (Conv2D)    (None, None, None, 1 262144      res4b5_branch2b_relu_rgb[0][0]
res3b2_branch2a_d (Conv2D)      (None, None, None, 1 65536       res3b1_relu_d[0][0]
bn4b5_branch2c_rgb (BatchNormal (None, None, None, 1 4096        res4b5_branch2c_rgb[0][0]
res3b2_branch2a_relu_d (Activat (None, None, None, 1 0           res3b2_branch2a_d[0][0]
res4b5_rgb (Add)                (None, None, None, 1 0           bn4b5_branch2c_rgb[0][0]
                                                                 res4b4_relu_rgb[0][0]
padding3b2_branch2b_d (ZeroPadd (None, None, None, 1 0           res3b2_branch2a_relu_d[0][0]
res4b5_relu_rgb (Activation)    (None, None, None, 1 0           res4b5_rgb[0][0]
res3b2_branch2b_d (Conv2D)      (None, None, None, 1 147456      padding3b2_branch2b_d[0][0]
__________________________________________________________________________________________________ res4b6_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b5_relu_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2b_relu_d (Activat (None, None, None, 1 0 res3b2_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b6_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b6_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2c_d (Conv2D) (None, None, None, 5 65536 res3b2_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b6_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b6_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3b2_d (Add) (None, None, None, 5 0 res3b2_branch2c_d[0][0] res3b1_relu_d[0][0] __________________________________________________________________________________________________ padding4b6_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b6_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res3b2_relu_d (Activation) (None, None, None, 5 0 res3b2_d[0][0] __________________________________________________________________________________________________ res4b6_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b6_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2a_d (Conv2D) (None, None, None, 1 65536 res3b2_relu_d[0][0] __________________________________________________________________________________________________ bn4b6_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b6_branch2b_rgb[0][0] 
__________________________________________________________________________________________________ res3b3_branch2a_relu_d (Activat (None, None, None, 1 0 res3b3_branch2a_d[0][0] __________________________________________________________________________________________________ res4b6_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b6_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding3b3_branch2b_d (ZeroPadd (None, None, None, 1 0 res3b3_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b6_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b6_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2b_d (Conv2D) (None, None, None, 1 147456 padding3b3_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b6_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b6_branch2c_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2b_relu_d (Activat (None, None, None, 1 0 res3b3_branch2b_d[0][0] __________________________________________________________________________________________________ res4b6_rgb (Add) (None, None, None, 1 0 bn4b6_branch2c_rgb[0][0] res4b5_relu_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2c_d (Conv2D) (None, None, None, 5 65536 res3b3_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b6_relu_rgb (Activation) (None, None, None, 1 0 res4b6_rgb[0][0] __________________________________________________________________________________________________ res3b3_d (Add) (None, None, None, 5 0 res3b3_branch2c_d[0][0] res3b2_relu_d[0][0] 
__________________________________________________________________________________________________ res4b7_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b6_relu_rgb[0][0] __________________________________________________________________________________________________ res3b3_relu_d (Activation) (None, None, None, 5 0 res3b3_d[0][0] __________________________________________________________________________________________________ bn4b7_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b7_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2a_d (Conv2D) (None, None, None, 2 131072 res3b3_relu_d[0][0] __________________________________________________________________________________________________ res4b7_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b7_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2a_relu_d (Activati (None, None, None, 2 0 res4a_branch2a_d[0][0] __________________________________________________________________________________________________ padding4b7_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b7_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ padding4a_branch2b_d (ZeroPaddi (None, None, None, 2 0 res4a_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b7_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b7_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4a_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b7_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b7_branch2b_rgb[0][0] 
__________________________________________________________________________________________________ res4a_branch2b_relu_d (Activati (None, None, None, 2 0 res4a_branch2b_d[0][0] __________________________________________________________________________________________________ res4b7_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b7_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2c_d (Conv2D) (None, None, None, 1 262144 res4a_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4a_branch1_d (Conv2D) (None, None, None, 1 524288 res3b3_relu_d[0][0] __________________________________________________________________________________________________ res4b7_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b7_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4a_d (Add) (None, None, None, 1 0 res4a_branch2c_d[0][0] res4a_branch1_d[0][0] __________________________________________________________________________________________________ bn4b7_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b7_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4a_relu_d (Activation) (None, None, None, 1 0 res4a_d[0][0] __________________________________________________________________________________________________ res4b7_rgb (Add) (None, None, None, 1 0 bn4b7_branch2c_rgb[0][0] res4b6_relu_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2a_d (Conv2D) (None, None, None, 2 262144 res4a_relu_d[0][0] __________________________________________________________________________________________________ res4b7_relu_rgb (Activation) (None, None, None, 1 0 res4b7_rgb[0][0] 
__________________________________________________________________________________________________ res4b1_branch2a_relu_d (Activat (None, None, None, 2 0 res4b1_branch2a_d[0][0] __________________________________________________________________________________________________ res4b8_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b7_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b1_branch2b_d (ZeroPadd (None, None, None, 2 0 res4b1_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b8_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b8_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b1_branch2b_d[0][0] __________________________________________________________________________________________________ res4b8_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b8_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2b_relu_d (Activat (None, None, None, 2 0 res4b1_branch2b_d[0][0] __________________________________________________________________________________________________ padding4b8_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b8_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b1_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b8_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b8_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b1_d (Add) (None, None, None, 1 0 res4b1_branch2c_d[0][0] res4a_relu_d[0][0] 
__________________________________________________________________________________________________ bn4b8_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b8_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b1_relu_d (Activation) (None, None, None, 1 0 res4b1_d[0][0] __________________________________________________________________________________________________ res4b8_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b8_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b2_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b1_relu_d[0][0] __________________________________________________________________________________________________ res4b8_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b8_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b2_branch2a_relu_d (Activat (None, None, None, 2 0 res4b2_branch2a_d[0][0] __________________________________________________________________________________________________ bn4b8_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b8_branch2c_rgb[0][0] __________________________________________________________________________________________________ padding4b2_branch2b_d (ZeroPadd (None, None, None, 2 0 res4b2_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b8_rgb (Add) (None, None, None, 1 0 bn4b8_branch2c_rgb[0][0] res4b7_relu_rgb[0][0] __________________________________________________________________________________________________ res4b2_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b2_branch2b_d[0][0] __________________________________________________________________________________________________ res4b8_relu_rgb (Activation) (None, None, None, 1 0 res4b8_rgb[0][0] 
__________________________________________________________________________________________________ res4b2_branch2b_relu_d (Activat (None, None, None, 2 0 res4b2_branch2b_d[0][0] __________________________________________________________________________________________________ res4b9_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b8_relu_rgb[0][0] __________________________________________________________________________________________________ res4b2_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b2_branch2b_relu_d[0][0] __________________________________________________________________________________________________ bn4b9_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b9_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b2_d (Add) (None, None, None, 1 0 res4b2_branch2c_d[0][0] res4b1_relu_d[0][0] __________________________________________________________________________________________________ res4b9_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b9_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b2_relu_d (Activation) (None, None, None, 1 0 res4b2_d[0][0] __________________________________________________________________________________________________ padding4b9_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b9_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b3_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b2_relu_d[0][0] __________________________________________________________________________________________________ res4b9_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b9_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b3_branch2a_relu_d (Activat (None, None, None, 2 0 res4b3_branch2a_d[0][0] 
__________________________________________________________________________________________________ bn4b9_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b9_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding4b3_branch2b_d (ZeroPadd (None, None, None, 2 0 res4b3_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b9_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b9_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b3_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b3_branch2b_d[0][0] __________________________________________________________________________________________________ res4b9_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b9_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b3_branch2b_relu_d (Activat (None, None, None, 2 0 res4b3_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b9_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b9_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b3_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b3_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b9_rgb (Add) (None, None, None, 1 0 bn4b9_branch2c_rgb[0][0] res4b8_relu_rgb[0][0] __________________________________________________________________________________________________ res4b3_d (Add) (None, None, None, 1 0 res4b3_branch2c_d[0][0] res4b2_relu_d[0][0] __________________________________________________________________________________________________ res4b9_relu_rgb (Activation) (None, None, None, 1 0 res4b9_rgb[0][0] 
__________________________________________________________________________________________________ res4b3_relu_d (Activation) (None, None, None, 1 0 res4b3_d[0][0] __________________________________________________________________________________________________ res4b10_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b9_relu_rgb[0][0] __________________________________________________________________________________________________ res4b4_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b3_relu_d[0][0] __________________________________________________________________________________________________ bn4b10_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b10_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b4_branch2a_relu_d (Activat (None, None, None, 2 0 res4b4_branch2a_d[0][0] __________________________________________________________________________________________________ res4b10_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b10_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding4b4_branch2b_d (ZeroPadd (None, None, None, 2 0 res4b4_branch2a_relu_d[0][0] __________________________________________________________________________________________________ padding4b10_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b10_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b4_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b4_branch2b_d[0][0] __________________________________________________________________________________________________ res4b10_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b10_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b4_branch2b_relu_d (Activat (None, None, None, 2 0 res4b4_branch2b_d[0][0] 
__________________________________________________________________________________________________ bn4b10_branch2b_rgb (BatchNorma (None, None, None, 2 1024 res4b10_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b4_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b4_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b10_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b10_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b4_d (Add) (None, None, None, 1 0 res4b4_branch2c_d[0][0] res4b3_relu_d[0][0] __________________________________________________________________________________________________ res4b10_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b10_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b4_relu_d (Activation) (None, None, None, 1 0 res4b4_d[0][0] __________________________________________________________________________________________________ bn4b10_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b10_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b5_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b4_relu_d[0][0] __________________________________________________________________________________________________ res4b10_rgb (Add) (None, None, None, 1 0 bn4b10_branch2c_rgb[0][0] res4b9_relu_rgb[0][0] __________________________________________________________________________________________________ res4b5_branch2a_relu_d (Activat (None, None, None, 2 0 res4b5_branch2a_d[0][0] __________________________________________________________________________________________________ res4b10_relu_rgb (Activation) (None, None, None, 1 0 res4b10_rgb[0][0] 
__________________________________________________________________________________________________ padding4b5_branch2b_d (ZeroPadd (None, None, None, 2 0 res4b5_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b11_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b10_relu_rgb[0][0] __________________________________________________________________________________________________ res4b5_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b5_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b11_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b11_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b5_branch2b_relu_d (Activat (None, None, None, 2 0 res4b5_branch2b_d[0][0] __________________________________________________________________________________________________ res4b11_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b11_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b5_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b5_branch2b_relu_d[0][0] __________________________________________________________________________________________________ padding4b11_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b11_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b5_d (Add) (None, None, None, 1 0 res4b5_branch2c_d[0][0] res4b4_relu_d[0][0] __________________________________________________________________________________________________ res4b11_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b11_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b5_relu_d (Activation) (None, None, None, 1 0 res4b5_d[0][0] 
__________________________________________________________________________________________________ bn4b11_branch2b_rgb (BatchNorma (None, None, None, 2 1024 res4b11_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b6_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b5_relu_d[0][0] __________________________________________________________________________________________________ res4b11_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b11_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b6_branch2a_relu_d (Activat (None, None, None, 2 0 res4b6_branch2a_d[0][0] __________________________________________________________________________________________________ res4b11_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b11_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b6_branch2b_d (ZeroPadd (None, None, None, 2 0 res4b6_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b11_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b11_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b6_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b6_branch2b_d[0][0] __________________________________________________________________________________________________ res4b11_rgb (Add) (None, None, None, 1 0 bn4b11_branch2c_rgb[0][0] res4b10_relu_rgb[0][0] __________________________________________________________________________________________________ res4b6_branch2b_relu_d (Activat (None, None, None, 2 0 res4b6_branch2b_d[0][0] __________________________________________________________________________________________________ res4b11_relu_rgb (Activation) (None, None, None, 1 0 res4b11_rgb[0][0] 
__________________________________________________________________________________________________ res4b6_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b6_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b12_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b11_relu_rgb[0][0] __________________________________________________________________________________________________ res4b6_d (Add) (None, None, None, 1 0 res4b6_branch2c_d[0][0] res4b5_relu_d[0][0] __________________________________________________________________________________________________ bn4b12_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b12_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b6_relu_d (Activation) (None, None, None, 1 0 res4b6_d[0][0] __________________________________________________________________________________________________ res4b12_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b12_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b7_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b6_relu_d[0][0] __________________________________________________________________________________________________ padding4b12_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b12_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b7_branch2a_relu_d (Activat (None, None, None, 2 0 res4b7_branch2a_d[0][0] __________________________________________________________________________________________________ res4b12_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b12_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding4b7_branch2b_d (ZeroPadd (None, None, None, 2 0 res4b7_branch2a_relu_d[0][0] 
__________________________________________________________________________________________________
 ... [res4 stage, blocks res4b7 through res4b22: the RGB and depth backbones interleave here.
 Every block in the _rgb branch repeats the bottleneck pattern shown below; every block in the
 depth (_d) branch repeats the same pattern without the BatchNormalization layers. The output
 shapes truncated by the log's line width are (None, None, None, 256) for the branch layers and
 (None, None, None, 1024) for the branch2c / Add / Activation layers.] ...
__________________________________________________________________________________________________
res4b12_branch2a_rgb (Conv2D)   (None, None, None, 256)  262144    res4b11_relu_rgb[0][0]
__________________________________________________________________________________________________
bn4b12_branch2a_rgb (BatchNorma (None, None, None, 256)  1024      res4b12_branch2a_rgb[0][0]
__________________________________________________________________________________________________
res4b12_branch2a_relu_rgb (Acti (None, None, None, 256)  0         bn4b12_branch2a_rgb[0][0]
__________________________________________________________________________________________________
padding4b12_branch2b_rgb (ZeroP (None, None, None, 256)  0         res4b12_branch2a_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b12_branch2b_rgb (Conv2D)   (None, None, None, 256)  589824    padding4b12_branch2b_rgb[0][0]
__________________________________________________________________________________________________
bn4b12_branch2b_rgb (BatchNorma (None, None, None, 256)  1024      res4b12_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b12_branch2b_relu_rgb (Acti (None, None, None, 256)  0         bn4b12_branch2b_rgb[0][0]
__________________________________________________________________________________________________
res4b12_branch2c_rgb (Conv2D)   (None, None, None, 1024) 262144    res4b12_branch2b_relu_rgb[0][0]
__________________________________________________________________________________________________
bn4b12_branch2c_rgb (BatchNorma (None, None, None, 1024) 4096      res4b12_branch2c_rgb[0][0]
__________________________________________________________________________________________________
res4b12_rgb (Add)               (None, None, None, 1024) 0         bn4b12_branch2c_rgb[0][0]
                                                                   res4b11_relu_rgb[0][0]
__________________________________________________________________________________________________
res4b12_relu_rgb (Activation)   (None, None, None, 1024) 0         res4b12_rgb[0][0]
__________________________________________________________________________________________________ res4b22_relu_rgb (Activation) (None, None,
None, 1 0 res4b22_rgb[0][0] __________________________________________________________________________________________________ res4b21_d (Add) (None, None, None, 1 0 res4b21_branch2c_d[0][0] res4b20_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch2a_rgb (Conv2D) (None, None, None, 5 524288 res4b22_relu_rgb[0][0] __________________________________________________________________________________________________ res4b21_relu_d (Activation) (None, None, None, 1 0 res4b21_d[0][0] __________________________________________________________________________________________________ bn5a_branch2a_rgb (BatchNormali (None, None, None, 5 2048 res5a_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b21_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch2a_relu_rgb (Activa (None, None, None, 5 0 bn5a_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2a_relu_d (Activa (None, None, None, 2 0 res4b22_branch2a_d[0][0] __________________________________________________________________________________________________ padding5a_branch2b_rgb (ZeroPad (None, None, None, 5 0 res5a_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b22_branch2b_d (ZeroPad (None, None, None, 2 0 res4b22_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch2b_rgb (Conv2D) (None, None, None, 5 2359296 padding5a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2b_d (Conv2D) (None, None, None, 2 589824 
padding4b22_branch2b_d[0][0] __________________________________________________________________________________________________ bn5a_branch2b_rgb (BatchNormali (None, None, None, 5 2048 res5a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2b_relu_d (Activa (None, None, None, 2 0 res4b22_branch2b_d[0][0] __________________________________________________________________________________________________ res5a_branch2b_relu_rgb (Activa (None, None, None, 5 0 bn5a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b22_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch2c_rgb (Conv2D) (None, None, None, 2 1048576 res5a_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res5a_branch1_rgb (Conv2D) (None, None, None, 2 2097152 res4b22_relu_rgb[0][0] __________________________________________________________________________________________________ res4b22_d (Add) (None, None, None, 1 0 res4b22_branch2c_d[0][0] res4b21_relu_d[0][0] __________________________________________________________________________________________________ bn5a_branch2c_rgb (BatchNormali (None, None, None, 2 8192 res5a_branch2c_rgb[0][0] __________________________________________________________________________________________________ bn5a_branch1_rgb (BatchNormaliz (None, None, None, 2 8192 res5a_branch1_rgb[0][0] __________________________________________________________________________________________________ res4b22_relu_d (Activation) (None, None, None, 1 0 res4b22_d[0][0] __________________________________________________________________________________________________ res5a_rgb (Add) (None, None, None, 2 0 bn5a_branch2c_rgb[0][0] 
bn5a_branch1_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2a_d (Conv2D) (None, None, None, 5 524288 res4b22_relu_d[0][0] __________________________________________________________________________________________________ res5a_relu_rgb (Activation) (None, None, None, 2 0 res5a_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2a_relu_d (Activati (None, None, None, 5 0 res5a_branch2a_d[0][0] __________________________________________________________________________________________________ res5b_branch2a_rgb (Conv2D) (None, None, None, 5 1048576 res5a_relu_rgb[0][0] __________________________________________________________________________________________________ padding5a_branch2b_d (ZeroPaddi (None, None, None, 5 0 res5a_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn5b_branch2a_rgb (BatchNormali (None, None, None, 5 2048 res5b_branch2a_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2b_d (Conv2D) (None, None, None, 5 2359296 padding5a_branch2b_d[0][0] __________________________________________________________________________________________________ res5b_branch2a_relu_rgb (Activa (None, None, None, 5 0 bn5b_branch2a_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2b_relu_d (Activati (None, None, None, 5 0 res5a_branch2b_d[0][0] __________________________________________________________________________________________________ padding5b_branch2b_rgb (ZeroPad (None, None, None, 5 0 res5b_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2c_d (Conv2D) (None, None, None, 2 1048576 res5a_branch2b_relu_d[0][0] 
__________________________________________________________________________________________________ res5a_branch1_d (Conv2D) (None, None, None, 2 2097152 res4b22_relu_d[0][0] __________________________________________________________________________________________________ res5b_branch2b_rgb (Conv2D) (None, None, None, 5 2359296 padding5b_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5a_d (Add) (None, None, None, 2 0 res5a_branch2c_d[0][0] res5a_branch1_d[0][0] __________________________________________________________________________________________________ bn5b_branch2b_rgb (BatchNormali (None, None, None, 5 2048 res5b_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5a_relu_d (Activation) (None, None, None, 2 0 res5a_d[0][0] __________________________________________________________________________________________________ res5b_branch2b_relu_rgb (Activa (None, None, None, 5 0 bn5b_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2a_d (Conv2D) (None, None, None, 5 1048576 res5a_relu_d[0][0] __________________________________________________________________________________________________ res5b_branch2c_rgb (Conv2D) (None, None, None, 2 1048576 res5b_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2a_relu_d (Activati (None, None, None, 5 0 res5b_branch2a_d[0][0] __________________________________________________________________________________________________ bn5b_branch2c_rgb (BatchNormali (None, None, None, 2 8192 res5b_branch2c_rgb[0][0] __________________________________________________________________________________________________ padding5b_branch2b_d (ZeroPaddi (None, None, None, 5 0 res5b_branch2a_relu_d[0][0] 
__________________________________________________________________________________________________ res5b_rgb (Add) (None, None, None, 2 0 bn5b_branch2c_rgb[0][0] res5a_relu_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2b_d (Conv2D) (None, None, None, 5 2359296 padding5b_branch2b_d[0][0] __________________________________________________________________________________________________ res5b_relu_rgb (Activation) (None, None, None, 2 0 res5b_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2b_relu_d (Activati (None, None, None, 5 0 res5b_branch2b_d[0][0] __________________________________________________________________________________________________ res5c_branch2a_rgb (Conv2D) (None, None, None, 5 1048576 res5b_relu_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2c_d (Conv2D) (None, None, None, 2 1048576 res5b_branch2b_relu_d[0][0] __________________________________________________________________________________________________ bn5c_branch2a_rgb (BatchNormali (None, None, None, 5 2048 res5c_branch2a_rgb[0][0] __________________________________________________________________________________________________ res5b_d (Add) (None, None, None, 2 0 res5b_branch2c_d[0][0] res5a_relu_d[0][0] __________________________________________________________________________________________________ res5c_branch2a_relu_rgb (Activa (None, None, None, 5 0 bn5c_branch2a_rgb[0][0] __________________________________________________________________________________________________ res5b_relu_d (Activation) (None, None, None, 2 0 res5b_d[0][0] __________________________________________________________________________________________________ padding5c_branch2b_rgb (ZeroPad (None, None, None, 5 0 res5c_branch2a_relu_rgb[0][0] 
__________________________________________________________________________________________________ res5c_branch2a_d (Conv2D) (None, None, None, 5 1048576 res5b_relu_d[0][0] __________________________________________________________________________________________________ res5c_branch2b_rgb (Conv2D) (None, None, None, 5 2359296 padding5c_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2a_relu_d (Activati (None, None, None, 5 0 res5c_branch2a_d[0][0] __________________________________________________________________________________________________ bn5c_branch2b_rgb (BatchNormali (None, None, None, 5 2048 res5c_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding5c_branch2b_d (ZeroPaddi (None, None, None, 5 0 res5c_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res5c_branch2b_relu_rgb (Activa (None, None, None, 5 0 bn5c_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2b_d (Conv2D) (None, None, None, 5 2359296 padding5c_branch2b_d[0][0] __________________________________________________________________________________________________ res5c_branch2c_rgb (Conv2D) (None, None, None, 2 1048576 res5c_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2b_relu_d (Activati (None, None, None, 5 0 res5c_branch2b_d[0][0] __________________________________________________________________________________________________ bn5c_branch2c_rgb (BatchNormali (None, None, None, 2 8192 res5c_branch2c_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2c_d (Conv2D) (None, None, None, 2 1048576 res5c_branch2b_relu_d[0][0] 
__________________________________________________________________________________________________ res5c_rgb (Add) (None, None, None, 2 0 bn5c_branch2c_rgb[0][0] res5b_relu_rgb[0][0] __________________________________________________________________________________________________ res5c_d (Add) (None, None, None, 2 0 res5c_branch2c_d[0][0] res5b_relu_d[0][0] __________________________________________________________________________________________________ res5c_relu_rgb (Activation) (None, None, None, 2 0 res5c_rgb[0][0] __________________________________________________________________________________________________ res5c_relu_d (Activation) (None, None, None, 2 0 res5c_d[0][0] __________________________________________________________________________________________________ concatenate_33 (Concatenate) (None, None, None, 4 0 res5c_relu_rgb[0][0] res5c_relu_d[0][0] __________________________________________________________________________________________________ C5_reduced (Conv2D) (None, None, None, 2 1048832 concatenate_33[0][0] __________________________________________________________________________________________________ concatenate_30 (Concatenate) (None, None, None, 2 0 res4b22_relu_rgb[0][0] res4b22_relu_d[0][0] __________________________________________________________________________________________________ P5_upsampled (UpsampleLike) (None, None, None, 2 0 C5_reduced[0][0] concatenate_30[0][0] __________________________________________________________________________________________________ C4_reduced (Conv2D) (None, None, None, 2 524544 concatenate_30[0][0] __________________________________________________________________________________________________ P4_merged (Add) (None, None, None, 2 0 P5_upsampled[0][0] C4_reduced[0][0] __________________________________________________________________________________________________ concatenate_7 (Concatenate) (None, None, None, 1 0 res3b3_relu_rgb[0][0] res3b3_relu_d[0][0] 
__________________________________________________________________________________________________ P4_upsampled (UpsampleLike) (None, None, None, 2 0 P4_merged[0][0] concatenate_7[0][0] __________________________________________________________________________________________________ C3_reduced (Conv2D) (None, None, None, 2 262400 concatenate_7[0][0] __________________________________________________________________________________________________ P6 (Conv2D) (None, None, None, 2 9437440 concatenate_33[0][0] __________________________________________________________________________________________________ P3_merged (Add) (None, None, None, 2 0 P4_upsampled[0][0] C3_reduced[0][0] __________________________________________________________________________________________________ C6_relu (Activation) (None, None, None, 2 0 P6[0][0] __________________________________________________________________________________________________ P3 (Conv2D) (None, None, None, 2 590080 P3_merged[0][0] __________________________________________________________________________________________________ P4 (Conv2D) (None, None, None, 2 590080 P4_merged[0][0] __________________________________________________________________________________________________ P5 (Conv2D) (None, None, None, 2 590080 C5_reduced[0][0] __________________________________________________________________________________________________ P7 (Conv2D) (None, None, None, 2 590080 C6_relu[0][0] __________________________________________________________________________________________________ regression_submodel (Model) (None, None, 4) 2443300 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0] __________________________________________________________________________________________________ classification_submodel (Model) (None, None, 1) 2381065 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0] __________________________________________________________________________________________________ regression (Concatenate) (None, None, 4) 0 
regression_submodel[1][0] regression_submodel[2][0] regression_submodel[3][0] regression_submodel[4][0] regression_submodel[5][0] __________________________________________________________________________________________________ classification (Concatenate) (None, None, 1) 0 classification_submodel[1][0] classification_submodel[2][0] classification_submodel[3][0] classification_submodel[4][0] classification_submodel[5][0] ================================================================================================== Total params: 103,451,949 Trainable params: 103,241,261 Non-trainable params: 210,688 __________________________________________________________________________________________________ None Epoch 1/20 1/500 [..............................] - ETA: 1:05:17 - loss: 3.7812 - regression_loss: 2.6478 - classification_loss: 1.1335 2/500 [..............................] - ETA: 34:04 - loss: 3.8996 - regression_loss: 2.7680 - classification_loss: 1.1316 3/500 [..............................] - ETA: 23:41 - loss: 3.9660 - regression_loss: 2.8351 - classification_loss: 1.1309 4/500 [..............................] - ETA: 18:31 - loss: 3.9788 - regression_loss: 2.8484 - classification_loss: 1.1304 5/500 [..............................] - ETA: 15:24 - loss: 3.9580 - regression_loss: 2.8277 - classification_loss: 1.1303 6/500 [..............................] - ETA: 13:22 - loss: 3.9738 - regression_loss: 2.8437 - classification_loss: 1.1301 7/500 [..............................] - ETA: 11:54 - loss: 3.9961 - regression_loss: 2.8660 - classification_loss: 1.1301 8/500 [..............................] - ETA: 10:46 - loss: 3.9797 - regression_loss: 2.8498 - classification_loss: 1.1299 9/500 [..............................] - ETA: 9:53 - loss: 3.9788 - regression_loss: 2.8488 - classification_loss: 1.1300 10/500 [..............................] - ETA: 9:12 - loss: 3.9831 - regression_loss: 2.8532 - classification_loss: 1.1298 11/500 [..............................] 
- ETA: 8:38 - loss: 3.9854 - regression_loss: 2.8557 - classification_loss: 1.1297 12/500 [..............................] - ETA: 8:09 - loss: 3.9731 - regression_loss: 2.8430 - classification_loss: 1.1301 13/500 [..............................] - ETA: 7:45 - loss: 3.9879 - regression_loss: 2.8576 - classification_loss: 1.1303 14/500 [..............................] - ETA: 7:23 - loss: 3.9801 - regression_loss: 2.8500 - classification_loss: 1.1301 15/500 [..............................] - ETA: 7:06 - loss: 3.9894 - regression_loss: 2.8594 - classification_loss: 1.1299 16/500 [..............................] - ETA: 6:50 - loss: 3.9858 - regression_loss: 2.8559 - classification_loss: 1.1299 17/500 [>.............................] - ETA: 6:36 - loss: 3.9861 - regression_loss: 2.8563 - classification_loss: 1.1297 18/500 [>.............................] - ETA: 6:24 - loss: 3.9843 - regression_loss: 2.8547 - classification_loss: 1.1297 19/500 [>.............................] - ETA: 6:12 - loss: 3.9702 - regression_loss: 2.8405 - classification_loss: 1.1297 20/500 [>.............................] - ETA: 6:02 - loss: 3.9868 - regression_loss: 2.8572 - classification_loss: 1.1296 21/500 [>.............................] - ETA: 5:53 - loss: 3.9923 - regression_loss: 2.8628 - classification_loss: 1.1295 22/500 [>.............................] - ETA: 5:45 - loss: 3.9946 - regression_loss: 2.8652 - classification_loss: 1.1294 23/500 [>.............................] - ETA: 5:38 - loss: 3.9936 - regression_loss: 2.8642 - classification_loss: 1.1294 24/500 [>.............................] - ETA: 5:31 - loss: 3.9991 - regression_loss: 2.8698 - classification_loss: 1.1293 25/500 [>.............................] - ETA: 5:24 - loss: 4.0018 - regression_loss: 2.8726 - classification_loss: 1.1292 26/500 [>.............................] - ETA: 5:18 - loss: 4.0011 - regression_loss: 2.8720 - classification_loss: 1.1291 27/500 [>.............................] 
- ETA: 5:12 - loss: 3.9988 - regression_loss: 2.8697 - classification_loss: 1.1291 28/500 [>.............................] - ETA: 5:07 - loss: 3.9969 - regression_loss: 2.8680 - classification_loss: 1.1289 29/500 [>.............................] - ETA: 5:01 - loss: 3.9912 - regression_loss: 2.8625 - classification_loss: 1.1288 30/500 [>.............................] - ETA: 4:57 - loss: 3.9924 - regression_loss: 2.8638 - classification_loss: 1.1286 31/500 [>.............................] - ETA: 4:53 - loss: 3.9952 - regression_loss: 2.8666 - classification_loss: 1.1286 32/500 [>.............................] - ETA: 4:49 - loss: 3.9918 - regression_loss: 2.8632 - classification_loss: 1.1286 33/500 [>.............................] - ETA: 4:45 - loss: 3.9920 - regression_loss: 2.8636 - classification_loss: 1.1284 34/500 [=>............................] - ETA: 4:41 - loss: 3.9942 - regression_loss: 2.8660 - classification_loss: 1.1282 35/500 [=>............................] - ETA: 4:37 - loss: 3.9938 - regression_loss: 2.8658 - classification_loss: 1.1281 36/500 [=>............................] - ETA: 4:34 - loss: 3.9925 - regression_loss: 2.8644 - classification_loss: 1.1281 37/500 [=>............................] - ETA: 4:32 - loss: 3.9952 - regression_loss: 2.8671 - classification_loss: 1.1280 38/500 [=>............................] - ETA: 4:28 - loss: 3.9960 - regression_loss: 2.8681 - classification_loss: 1.1279 39/500 [=>............................] - ETA: 4:26 - loss: 3.9959 - regression_loss: 2.8681 - classification_loss: 1.1278 40/500 [=>............................] - ETA: 4:23 - loss: 3.9949 - regression_loss: 2.8672 - classification_loss: 1.1276 41/500 [=>............................] - ETA: 4:20 - loss: 3.9936 - regression_loss: 2.8661 - classification_loss: 1.1275 42/500 [=>............................] - ETA: 4:17 - loss: 3.9903 - regression_loss: 2.8629 - classification_loss: 1.1274 43/500 [=>............................] 
- ETA: 4:15 - loss: 3.9882 - regression_loss: 2.8609 - classification_loss: 1.1273 44/500 [=>............................] - ETA: 4:13 - loss: 3.9827 - regression_loss: 2.8555 - classification_loss: 1.1272 45/500 [=>............................] - ETA: 4:10 - loss: 3.9809 - regression_loss: 2.8539 - classification_loss: 1.1269 46/500 [=>............................] - ETA: 4:08 - loss: 3.9794 - regression_loss: 2.8526 - classification_loss: 1.1268 47/500 [=>............................] - ETA: 4:06 - loss: 3.9856 - regression_loss: 2.8587 - classification_loss: 1.1269 48/500 [=>............................] - ETA: 4:04 - loss: 3.9836 - regression_loss: 2.8568 - classification_loss: 1.1268 49/500 [=>............................] - ETA: 4:02 - loss: 3.9812 - regression_loss: 2.8546 - classification_loss: 1.1267 50/500 [==>...........................] - ETA: 4:00 - loss: 3.9782 - regression_loss: 2.8516 - classification_loss: 1.1266 51/500 [==>...........................] - ETA: 3:58 - loss: 3.9743 - regression_loss: 2.8480 - classification_loss: 1.1263 52/500 [==>...........................] - ETA: 3:56 - loss: 3.9714 - regression_loss: 2.8454 - classification_loss: 1.1260 53/500 [==>...........................] - ETA: 3:55 - loss: 3.9738 - regression_loss: 2.8479 - classification_loss: 1.1259 54/500 [==>...........................] - ETA: 3:53 - loss: 3.9692 - regression_loss: 2.8434 - classification_loss: 1.1258 55/500 [==>...........................] - ETA: 3:52 - loss: 3.9672 - regression_loss: 2.8414 - classification_loss: 1.1258 56/500 [==>...........................] - ETA: 3:50 - loss: 3.9624 - regression_loss: 2.8370 - classification_loss: 1.1254 57/500 [==>...........................] - ETA: 3:49 - loss: 3.9583 - regression_loss: 2.8332 - classification_loss: 1.1251 58/500 [==>...........................] - ETA: 3:47 - loss: 3.9532 - regression_loss: 2.8285 - classification_loss: 1.1247 59/500 [==>...........................] 
- ETA: 3:45 - loss: 3.9505 - regression_loss: 2.8260 - classification_loss: 1.1245 60/500 [==>...........................] - ETA: 3:44 - loss: 3.9512 - regression_loss: 2.8267 - classification_loss: 1.1245 61/500 [==>...........................] - ETA: 3:42 - loss: 3.9476 - regression_loss: 2.8234 - classification_loss: 1.1242 62/500 [==>...........................] - ETA: 3:41 - loss: 3.9474 - regression_loss: 2.8233 - classification_loss: 1.1240 63/500 [==>...........................] - ETA: 3:39 - loss: 3.9421 - regression_loss: 2.8186 - classification_loss: 1.1235 64/500 [==>...........................] - ETA: 3:38 - loss: 3.9375 - regression_loss: 2.8143 - classification_loss: 1.1232 65/500 [==>...........................] - ETA: 3:36 - loss: 3.9330 - regression_loss: 2.8104 - classification_loss: 1.1225 66/500 [==>...........................] - ETA: 3:35 - loss: 3.9333 - regression_loss: 2.8108 - classification_loss: 1.1225 67/500 [===>..........................] - ETA: 3:33 - loss: 3.9309 - regression_loss: 2.8090 - classification_loss: 1.1219 68/500 [===>..........................] - ETA: 3:32 - loss: 3.9279 - regression_loss: 2.8064 - classification_loss: 1.1215 69/500 [===>..........................] - ETA: 3:31 - loss: 3.9250 - regression_loss: 2.8039 - classification_loss: 1.1211 70/500 [===>..........................] - ETA: 3:29 - loss: 3.9206 - regression_loss: 2.8002 - classification_loss: 1.1204 71/500 [===>..........................] - ETA: 3:28 - loss: 3.9191 - regression_loss: 2.7988 - classification_loss: 1.1203 72/500 [===>..........................] - ETA: 3:27 - loss: 3.9135 - regression_loss: 2.7939 - classification_loss: 1.1197 73/500 [===>..........................] - ETA: 3:25 - loss: 3.9107 - regression_loss: 2.7915 - classification_loss: 1.1193 74/500 [===>..........................] - ETA: 3:24 - loss: 3.9082 - regression_loss: 2.7892 - classification_loss: 1.1191 75/500 [===>..........................] 
- ETA: 3:23 - loss: 3.9099 - regression_loss: 2.7910 - classification_loss: 1.1190
 76/500 [===>..........................] - ETA: 3:22 - loss: 3.9078 - regression_loss: 2.7891 - classification_loss: 1.1187
100/500 [=====>........................] - ETA: 2:59 - loss: 3.8358 - regression_loss: 2.7354 - classification_loss: 1.1004
125/500 [======>.......................] - ETA: 2:41 - loss: 3.7508 - regression_loss: 2.6946 - classification_loss: 1.0562
150/500 [========>.....................] - ETA: 2:26 - loss: 3.6550 - regression_loss: 2.6608 - classification_loss: 0.9941
175/500 [=========>....................] - ETA: 2:12 - loss: 3.5829 - regression_loss: 2.6403 - classification_loss: 0.9426
200/500 [===========>..................] - ETA: 2:00 - loss: 3.5267 - regression_loss: 2.6268 - classification_loss: 0.8999
225/500 [============>.................] - ETA: 1:49 - loss: 3.4736 - regression_loss: 2.6145 - classification_loss: 0.8591
250/500 [==============>...............] - ETA: 1:38 - loss: 3.4164 - regression_loss: 2.5946 - classification_loss: 0.8218
275/500 [===============>..............] - ETA: 1:27 - loss: 3.3790 - regression_loss: 2.5857 - classification_loss: 0.7933
300/500 [=================>............] - ETA: 1:17 - loss: 3.3425 - regression_loss: 2.5758 - classification_loss: 0.7667
325/500 [==================>...........] - ETA: 1:07 - loss: 3.3060 - regression_loss: 2.5645 - classification_loss: 0.7415
350/500 [====================>.........] - ETA: 57s - loss: 3.2735 - regression_loss: 2.5532 - classification_loss: 0.7203
375/500 [=====================>........] - ETA: 47s - loss: 3.2482 - regression_loss: 2.5467 - classification_loss: 0.7015
400/500 [=======================>......] - ETA: 38s - loss: 3.2168 - regression_loss: 2.5322 - classification_loss: 0.6846
410/500 [=======================>......] - ETA: 34s - loss: 3.2066 - regression_loss: 2.5274 - classification_loss: 0.6791
411/500 [=======================>......] 
- ETA: 33s - loss: 3.2062 - regression_loss: 2.5273 - classification_loss: 0.6789 412/500 [=======================>......] - ETA: 33s - loss: 3.2049 - regression_loss: 2.5267 - classification_loss: 0.6782 413/500 [=======================>......] - ETA: 33s - loss: 3.2047 - regression_loss: 2.5271 - classification_loss: 0.6776 414/500 [=======================>......] - ETA: 32s - loss: 3.2031 - regression_loss: 2.5263 - classification_loss: 0.6768 415/500 [=======================>......] - ETA: 32s - loss: 3.2037 - regression_loss: 2.5274 - classification_loss: 0.6764 416/500 [=======================>......] - ETA: 31s - loss: 3.2024 - regression_loss: 2.5268 - classification_loss: 0.6756 417/500 [========================>.....] - ETA: 31s - loss: 3.2016 - regression_loss: 2.5268 - classification_loss: 0.6749 418/500 [========================>.....] - ETA: 31s - loss: 3.2006 - regression_loss: 2.5264 - classification_loss: 0.6742 419/500 [========================>.....] - ETA: 30s - loss: 3.1996 - regression_loss: 2.5258 - classification_loss: 0.6738 420/500 [========================>.....] - ETA: 30s - loss: 3.1984 - regression_loss: 2.5252 - classification_loss: 0.6732 421/500 [========================>.....] - ETA: 29s - loss: 3.1973 - regression_loss: 2.5248 - classification_loss: 0.6725 422/500 [========================>.....] - ETA: 29s - loss: 3.1955 - regression_loss: 2.5238 - classification_loss: 0.6717 423/500 [========================>.....] - ETA: 29s - loss: 3.1937 - regression_loss: 2.5226 - classification_loss: 0.6711 424/500 [========================>.....] - ETA: 28s - loss: 3.1922 - regression_loss: 2.5218 - classification_loss: 0.6703 425/500 [========================>.....] - ETA: 28s - loss: 3.1911 - regression_loss: 2.5213 - classification_loss: 0.6698 426/500 [========================>.....] - ETA: 28s - loss: 3.1905 - regression_loss: 2.5214 - classification_loss: 0.6691 427/500 [========================>.....] 
- ETA: 27s - loss: 3.1903 - regression_loss: 2.5218 - classification_loss: 0.6685 428/500 [========================>.....] - ETA: 27s - loss: 3.1882 - regression_loss: 2.5206 - classification_loss: 0.6675 429/500 [========================>.....] - ETA: 26s - loss: 3.1863 - regression_loss: 2.5195 - classification_loss: 0.6669 430/500 [========================>.....] - ETA: 26s - loss: 3.1851 - regression_loss: 2.5190 - classification_loss: 0.6661 431/500 [========================>.....] - ETA: 26s - loss: 3.1834 - regression_loss: 2.5181 - classification_loss: 0.6653 432/500 [========================>.....] - ETA: 25s - loss: 3.1813 - regression_loss: 2.5170 - classification_loss: 0.6643 433/500 [========================>.....] - ETA: 25s - loss: 3.1797 - regression_loss: 2.5161 - classification_loss: 0.6636 434/500 [=========================>....] - ETA: 24s - loss: 3.1792 - regression_loss: 2.5157 - classification_loss: 0.6635 435/500 [=========================>....] - ETA: 24s - loss: 3.1782 - regression_loss: 2.5152 - classification_loss: 0.6630 436/500 [=========================>....] - ETA: 24s - loss: 3.1770 - regression_loss: 2.5147 - classification_loss: 0.6623 437/500 [=========================>....] - ETA: 23s - loss: 3.1750 - regression_loss: 2.5136 - classification_loss: 0.6615 438/500 [=========================>....] - ETA: 23s - loss: 3.1743 - regression_loss: 2.5132 - classification_loss: 0.6610 439/500 [=========================>....] - ETA: 23s - loss: 3.1719 - regression_loss: 2.5117 - classification_loss: 0.6602 440/500 [=========================>....] - ETA: 22s - loss: 3.1729 - regression_loss: 2.5127 - classification_loss: 0.6603 441/500 [=========================>....] - ETA: 22s - loss: 3.1714 - regression_loss: 2.5118 - classification_loss: 0.6595 442/500 [=========================>....] - ETA: 21s - loss: 3.1700 - regression_loss: 2.5113 - classification_loss: 0.6588 443/500 [=========================>....] 
- ETA: 21s - loss: 3.1690 - regression_loss: 2.5105 - classification_loss: 0.6585 444/500 [=========================>....] - ETA: 21s - loss: 3.1676 - regression_loss: 2.5098 - classification_loss: 0.6578 445/500 [=========================>....] - ETA: 20s - loss: 3.1677 - regression_loss: 2.5100 - classification_loss: 0.6577 446/500 [=========================>....] - ETA: 20s - loss: 3.1658 - regression_loss: 2.5088 - classification_loss: 0.6570 447/500 [=========================>....] - ETA: 20s - loss: 3.1646 - regression_loss: 2.5082 - classification_loss: 0.6564 448/500 [=========================>....] - ETA: 19s - loss: 3.1631 - regression_loss: 2.5073 - classification_loss: 0.6557 449/500 [=========================>....] - ETA: 19s - loss: 3.1622 - regression_loss: 2.5069 - classification_loss: 0.6553 450/500 [==========================>...] - ETA: 18s - loss: 3.1613 - regression_loss: 2.5066 - classification_loss: 0.6547 451/500 [==========================>...] - ETA: 18s - loss: 3.1605 - regression_loss: 2.5062 - classification_loss: 0.6543 452/500 [==========================>...] - ETA: 18s - loss: 3.1591 - regression_loss: 2.5053 - classification_loss: 0.6538 453/500 [==========================>...] - ETA: 17s - loss: 3.1586 - regression_loss: 2.5051 - classification_loss: 0.6535 454/500 [==========================>...] - ETA: 17s - loss: 3.1578 - regression_loss: 2.5047 - classification_loss: 0.6530 455/500 [==========================>...] - ETA: 16s - loss: 3.1566 - regression_loss: 2.5043 - classification_loss: 0.6523 456/500 [==========================>...] - ETA: 16s - loss: 3.1553 - regression_loss: 2.5036 - classification_loss: 0.6517 457/500 [==========================>...] - ETA: 16s - loss: 3.1541 - regression_loss: 2.5030 - classification_loss: 0.6511 458/500 [==========================>...] - ETA: 15s - loss: 3.1526 - regression_loss: 2.5020 - classification_loss: 0.6506 459/500 [==========================>...] 
- ETA: 15s - loss: 3.1517 - regression_loss: 2.5017 - classification_loss: 0.6500 460/500 [==========================>...] - ETA: 15s - loss: 3.1507 - regression_loss: 2.5012 - classification_loss: 0.6495 461/500 [==========================>...] - ETA: 14s - loss: 3.1475 - regression_loss: 2.4990 - classification_loss: 0.6485 462/500 [==========================>...] - ETA: 14s - loss: 3.1462 - regression_loss: 2.4983 - classification_loss: 0.6480 463/500 [==========================>...] - ETA: 13s - loss: 3.1446 - regression_loss: 2.4974 - classification_loss: 0.6473 464/500 [==========================>...] - ETA: 13s - loss: 3.1438 - regression_loss: 2.4971 - classification_loss: 0.6468 465/500 [==========================>...] - ETA: 13s - loss: 3.1429 - regression_loss: 2.4966 - classification_loss: 0.6463 466/500 [==========================>...] - ETA: 12s - loss: 3.1418 - regression_loss: 2.4961 - classification_loss: 0.6457 467/500 [===========================>..] - ETA: 12s - loss: 3.1410 - regression_loss: 2.4958 - classification_loss: 0.6452 468/500 [===========================>..] - ETA: 12s - loss: 3.1399 - regression_loss: 2.4953 - classification_loss: 0.6447 469/500 [===========================>..] - ETA: 11s - loss: 3.1385 - regression_loss: 2.4945 - classification_loss: 0.6440 470/500 [===========================>..] - ETA: 11s - loss: 3.1377 - regression_loss: 2.4942 - classification_loss: 0.6435 471/500 [===========================>..] - ETA: 10s - loss: 3.1366 - regression_loss: 2.4937 - classification_loss: 0.6429 472/500 [===========================>..] - ETA: 10s - loss: 3.1351 - regression_loss: 2.4928 - classification_loss: 0.6423 473/500 [===========================>..] - ETA: 10s - loss: 3.1336 - regression_loss: 2.4917 - classification_loss: 0.6419 474/500 [===========================>..] - ETA: 9s - loss: 3.1327 - regression_loss: 2.4913 - classification_loss: 0.6414 475/500 [===========================>..] 
- ETA: 9s - loss: 3.1313 - regression_loss: 2.4905 - classification_loss: 0.6408 476/500 [===========================>..] - ETA: 9s - loss: 3.1293 - regression_loss: 2.4894 - classification_loss: 0.6400 477/500 [===========================>..] - ETA: 8s - loss: 3.1282 - regression_loss: 2.4888 - classification_loss: 0.6395 478/500 [===========================>..] - ETA: 8s - loss: 3.1286 - regression_loss: 2.4896 - classification_loss: 0.6391 479/500 [===========================>..] - ETA: 7s - loss: 3.1270 - regression_loss: 2.4886 - classification_loss: 0.6384 480/500 [===========================>..] - ETA: 7s - loss: 3.1262 - regression_loss: 2.4883 - classification_loss: 0.6379 481/500 [===========================>..] - ETA: 7s - loss: 3.1249 - regression_loss: 2.4876 - classification_loss: 0.6373 482/500 [===========================>..] - ETA: 6s - loss: 3.1237 - regression_loss: 2.4870 - classification_loss: 0.6367 483/500 [===========================>..] - ETA: 6s - loss: 3.1225 - regression_loss: 2.4863 - classification_loss: 0.6362 484/500 [============================>.] - ETA: 6s - loss: 3.1211 - regression_loss: 2.4856 - classification_loss: 0.6356 485/500 [============================>.] - ETA: 5s - loss: 3.1203 - regression_loss: 2.4849 - classification_loss: 0.6353 486/500 [============================>.] - ETA: 5s - loss: 3.1192 - regression_loss: 2.4844 - classification_loss: 0.6348 487/500 [============================>.] - ETA: 4s - loss: 3.1180 - regression_loss: 2.4838 - classification_loss: 0.6342 488/500 [============================>.] - ETA: 4s - loss: 3.1169 - regression_loss: 2.4832 - classification_loss: 0.6337 489/500 [============================>.] - ETA: 4s - loss: 3.1154 - regression_loss: 2.4823 - classification_loss: 0.6331 490/500 [============================>.] - ETA: 3s - loss: 3.1138 - regression_loss: 2.4814 - classification_loss: 0.6324 491/500 [============================>.] 
- ETA: 3s - loss: 3.1135 - regression_loss: 2.4815 - classification_loss: 0.6320 492/500 [============================>.] - ETA: 3s - loss: 3.1123 - regression_loss: 2.4808 - classification_loss: 0.6315 493/500 [============================>.] - ETA: 2s - loss: 3.1112 - regression_loss: 2.4801 - classification_loss: 0.6311 494/500 [============================>.] - ETA: 2s - loss: 3.1106 - regression_loss: 2.4800 - classification_loss: 0.6306 495/500 [============================>.] - ETA: 1s - loss: 3.1094 - regression_loss: 2.4794 - classification_loss: 0.6300 496/500 [============================>.] - ETA: 1s - loss: 3.1075 - regression_loss: 2.4782 - classification_loss: 0.6293 497/500 [============================>.] - ETA: 1s - loss: 3.1062 - regression_loss: 2.4774 - classification_loss: 0.6288 498/500 [============================>.] - ETA: 0s - loss: 3.1051 - regression_loss: 2.4768 - classification_loss: 0.6284 499/500 [============================>.] - ETA: 0s - loss: 3.1041 - regression_loss: 2.4762 - classification_loss: 0.6279 500/500 [==============================] - 188s 376ms/step - loss: 3.1026 - regression_loss: 2.4753 - classification_loss: 0.6273 1172 instances of class plum with average precision: 0.2574 mAP: 0.2574 Epoch 00001: saving model to ./training/snapshots/resnet101_pascal_01.h5 Epoch 2/20 1/500 [..............................] - ETA: 3:16 - loss: 2.4172 - regression_loss: 1.9492 - classification_loss: 0.4679 2/500 [..............................] - ETA: 3:10 - loss: 2.3523 - regression_loss: 1.9874 - classification_loss: 0.3650 3/500 [..............................] - ETA: 3:11 - loss: 2.4340 - regression_loss: 2.0615 - classification_loss: 0.3725 4/500 [..............................] - ETA: 3:10 - loss: 2.5624 - regression_loss: 2.1549 - classification_loss: 0.4075 5/500 [..............................] - ETA: 3:11 - loss: 2.5164 - regression_loss: 2.1231 - classification_loss: 0.3933 6/500 [..............................] 
- ETA: 3:09 - loss: 2.5283 - regression_loss: 2.1453 - classification_loss: 0.3830 7/500 [..............................] - ETA: 3:08 - loss: 2.5001 - regression_loss: 2.1252 - classification_loss: 0.3749 8/500 [..............................] - ETA: 3:10 - loss: 2.5518 - regression_loss: 2.1633 - classification_loss: 0.3885 9/500 [..............................] - ETA: 3:08 - loss: 2.5658 - regression_loss: 2.1775 - classification_loss: 0.3883 10/500 [..............................] - ETA: 3:08 - loss: 2.5996 - regression_loss: 2.2154 - classification_loss: 0.3842 11/500 [..............................] - ETA: 3:07 - loss: 2.6065 - regression_loss: 2.2228 - classification_loss: 0.3837 12/500 [..............................] - ETA: 3:06 - loss: 2.6301 - regression_loss: 2.2454 - classification_loss: 0.3847 13/500 [..............................] - ETA: 3:05 - loss: 2.6281 - regression_loss: 2.2449 - classification_loss: 0.3832 14/500 [..............................] - ETA: 3:04 - loss: 2.6754 - regression_loss: 2.2547 - classification_loss: 0.4207 15/500 [..............................] - ETA: 3:04 - loss: 2.6538 - regression_loss: 2.2364 - classification_loss: 0.4174 16/500 [..............................] - ETA: 3:03 - loss: 2.6564 - regression_loss: 2.2394 - classification_loss: 0.4170 17/500 [>.............................] - ETA: 3:03 - loss: 2.6880 - regression_loss: 2.2683 - classification_loss: 0.4197 18/500 [>.............................] - ETA: 3:02 - loss: 2.6345 - regression_loss: 2.2217 - classification_loss: 0.4128 19/500 [>.............................] - ETA: 3:01 - loss: 2.6176 - regression_loss: 2.2084 - classification_loss: 0.4092 20/500 [>.............................] - ETA: 3:01 - loss: 2.5973 - regression_loss: 2.1944 - classification_loss: 0.4029 21/500 [>.............................] - ETA: 3:00 - loss: 2.5882 - regression_loss: 2.1875 - classification_loss: 0.4008 22/500 [>.............................] 
- ETA: 3:00 - loss: 2.5787 - regression_loss: 2.1789 - classification_loss: 0.3997 23/500 [>.............................] - ETA: 3:00 - loss: 2.5629 - regression_loss: 2.1660 - classification_loss: 0.3969 24/500 [>.............................] - ETA: 2:59 - loss: 2.5685 - regression_loss: 2.1720 - classification_loss: 0.3965 25/500 [>.............................] - ETA: 2:59 - loss: 2.5705 - regression_loss: 2.1739 - classification_loss: 0.3967 26/500 [>.............................] - ETA: 2:58 - loss: 2.5803 - regression_loss: 2.1844 - classification_loss: 0.3959 27/500 [>.............................] - ETA: 2:57 - loss: 2.5636 - regression_loss: 2.1713 - classification_loss: 0.3923 28/500 [>.............................] - ETA: 2:57 - loss: 2.5640 - regression_loss: 2.1725 - classification_loss: 0.3915 29/500 [>.............................] - ETA: 2:56 - loss: 2.5432 - regression_loss: 2.1560 - classification_loss: 0.3872 30/500 [>.............................] - ETA: 2:56 - loss: 2.5487 - regression_loss: 2.1584 - classification_loss: 0.3903 31/500 [>.............................] - ETA: 2:56 - loss: 2.5402 - regression_loss: 2.1524 - classification_loss: 0.3878 32/500 [>.............................] - ETA: 2:55 - loss: 2.5234 - regression_loss: 2.1394 - classification_loss: 0.3839 33/500 [>.............................] - ETA: 2:55 - loss: 2.5295 - regression_loss: 2.1465 - classification_loss: 0.3830 34/500 [=>............................] - ETA: 2:54 - loss: 2.5272 - regression_loss: 2.1451 - classification_loss: 0.3821 35/500 [=>............................] - ETA: 2:54 - loss: 2.5267 - regression_loss: 2.1457 - classification_loss: 0.3810 36/500 [=>............................] - ETA: 2:53 - loss: 2.5108 - regression_loss: 2.1336 - classification_loss: 0.3772 37/500 [=>............................] - ETA: 2:53 - loss: 2.5140 - regression_loss: 2.1361 - classification_loss: 0.3780 38/500 [=>............................] 
- ETA: 2:53 - loss: 2.5139 - regression_loss: 2.1343 - classification_loss: 0.3796 39/500 [=>............................] - ETA: 2:52 - loss: 2.5129 - regression_loss: 2.1332 - classification_loss: 0.3797 40/500 [=>............................] - ETA: 2:52 - loss: 2.5113 - regression_loss: 2.1326 - classification_loss: 0.3787 41/500 [=>............................] - ETA: 2:51 - loss: 2.5060 - regression_loss: 2.1287 - classification_loss: 0.3773 42/500 [=>............................] - ETA: 2:51 - loss: 2.5092 - regression_loss: 2.1317 - classification_loss: 0.3775 43/500 [=>............................] - ETA: 2:50 - loss: 2.5139 - regression_loss: 2.1352 - classification_loss: 0.3787 44/500 [=>............................] - ETA: 2:50 - loss: 2.5120 - regression_loss: 2.1338 - classification_loss: 0.3782 45/500 [=>............................] - ETA: 2:49 - loss: 2.5269 - regression_loss: 2.1417 - classification_loss: 0.3852 46/500 [=>............................] - ETA: 2:49 - loss: 2.5161 - regression_loss: 2.1334 - classification_loss: 0.3827 47/500 [=>............................] - ETA: 2:49 - loss: 2.5135 - regression_loss: 2.1307 - classification_loss: 0.3828 48/500 [=>............................] - ETA: 2:48 - loss: 2.5130 - regression_loss: 2.1304 - classification_loss: 0.3825 49/500 [=>............................] - ETA: 2:48 - loss: 2.5053 - regression_loss: 2.1251 - classification_loss: 0.3802 50/500 [==>...........................] - ETA: 2:48 - loss: 2.5028 - regression_loss: 2.1229 - classification_loss: 0.3800 51/500 [==>...........................] - ETA: 2:47 - loss: 2.5040 - regression_loss: 2.1243 - classification_loss: 0.3797 52/500 [==>...........................] - ETA: 2:47 - loss: 2.5027 - regression_loss: 2.1239 - classification_loss: 0.3788 53/500 [==>...........................] - ETA: 2:47 - loss: 2.4982 - regression_loss: 2.1206 - classification_loss: 0.3776 54/500 [==>...........................] 
- ETA: 2:46 - loss: 2.4895 - regression_loss: 2.1138 - classification_loss: 0.3757 55/500 [==>...........................] - ETA: 2:46 - loss: 2.4871 - regression_loss: 2.1119 - classification_loss: 0.3752 56/500 [==>...........................] - ETA: 2:46 - loss: 2.4965 - regression_loss: 2.1178 - classification_loss: 0.3786 57/500 [==>...........................] - ETA: 2:45 - loss: 2.4940 - regression_loss: 2.1164 - classification_loss: 0.3776 58/500 [==>...........................] - ETA: 2:45 - loss: 2.4961 - regression_loss: 2.1163 - classification_loss: 0.3798 59/500 [==>...........................] - ETA: 2:44 - loss: 2.4960 - regression_loss: 2.1166 - classification_loss: 0.3793 60/500 [==>...........................] - ETA: 2:44 - loss: 2.4940 - regression_loss: 2.1152 - classification_loss: 0.3788 61/500 [==>...........................] - ETA: 2:44 - loss: 2.4899 - regression_loss: 2.1123 - classification_loss: 0.3776 62/500 [==>...........................] - ETA: 2:44 - loss: 2.4914 - regression_loss: 2.1142 - classification_loss: 0.3773 63/500 [==>...........................] - ETA: 2:43 - loss: 2.4944 - regression_loss: 2.1169 - classification_loss: 0.3775 64/500 [==>...........................] - ETA: 2:43 - loss: 2.4942 - regression_loss: 2.1168 - classification_loss: 0.3774 65/500 [==>...........................] - ETA: 2:43 - loss: 2.4886 - regression_loss: 2.1127 - classification_loss: 0.3759 66/500 [==>...........................] - ETA: 2:42 - loss: 2.4868 - regression_loss: 2.1116 - classification_loss: 0.3752 67/500 [===>..........................] - ETA: 2:42 - loss: 2.4855 - regression_loss: 2.1107 - classification_loss: 0.3748 68/500 [===>..........................] - ETA: 2:42 - loss: 2.4853 - regression_loss: 2.1104 - classification_loss: 0.3749 69/500 [===>..........................] - ETA: 2:41 - loss: 2.4891 - regression_loss: 2.1135 - classification_loss: 0.3756 70/500 [===>..........................] 
- ETA: 2:41 - loss: 2.4685 - regression_loss: 2.0961 - classification_loss: 0.3724 71/500 [===>..........................] - ETA: 2:41 - loss: 2.4633 - regression_loss: 2.0923 - classification_loss: 0.3710 72/500 [===>..........................] - ETA: 2:41 - loss: 2.4683 - regression_loss: 2.0967 - classification_loss: 0.3716 73/500 [===>..........................] - ETA: 2:40 - loss: 2.4677 - regression_loss: 2.0962 - classification_loss: 0.3715 74/500 [===>..........................] - ETA: 2:40 - loss: 2.4676 - regression_loss: 2.0960 - classification_loss: 0.3716 75/500 [===>..........................] - ETA: 2:39 - loss: 2.4683 - regression_loss: 2.0968 - classification_loss: 0.3715 76/500 [===>..........................] - ETA: 2:39 - loss: 2.4682 - regression_loss: 2.0972 - classification_loss: 0.3709 77/500 [===>..........................] - ETA: 2:38 - loss: 2.4592 - regression_loss: 2.0893 - classification_loss: 0.3700 78/500 [===>..........................] - ETA: 2:38 - loss: 2.4561 - regression_loss: 2.0867 - classification_loss: 0.3694 79/500 [===>..........................] - ETA: 2:37 - loss: 2.4527 - regression_loss: 2.0837 - classification_loss: 0.3690 80/500 [===>..........................] - ETA: 2:37 - loss: 2.4511 - regression_loss: 2.0826 - classification_loss: 0.3685 81/500 [===>..........................] - ETA: 2:37 - loss: 2.4528 - regression_loss: 2.0832 - classification_loss: 0.3696 82/500 [===>..........................] - ETA: 2:37 - loss: 2.4493 - regression_loss: 2.0805 - classification_loss: 0.3688 83/500 [===>..........................] - ETA: 2:36 - loss: 2.4446 - regression_loss: 2.0757 - classification_loss: 0.3689 84/500 [====>.........................] - ETA: 2:36 - loss: 2.4431 - regression_loss: 2.0719 - classification_loss: 0.3713 85/500 [====>.........................] - ETA: 2:36 - loss: 2.4432 - regression_loss: 2.0723 - classification_loss: 0.3709 86/500 [====>.........................] 
- ETA: 2:35 - loss: 2.4432 - regression_loss: 2.0723 - classification_loss: 0.3709 87/500 [====>.........................] - ETA: 2:35 - loss: 2.4390 - regression_loss: 2.0689 - classification_loss: 0.3701 88/500 [====>.........................] - ETA: 2:35 - loss: 2.4425 - regression_loss: 2.0720 - classification_loss: 0.3704 89/500 [====>.........................] - ETA: 2:34 - loss: 2.4450 - regression_loss: 2.0738 - classification_loss: 0.3712 90/500 [====>.........................] - ETA: 2:34 - loss: 2.4465 - regression_loss: 2.0747 - classification_loss: 0.3717 91/500 [====>.........................] - ETA: 2:34 - loss: 2.4466 - regression_loss: 2.0744 - classification_loss: 0.3722 92/500 [====>.........................] - ETA: 2:33 - loss: 2.4511 - regression_loss: 2.0779 - classification_loss: 0.3731 93/500 [====>.........................] - ETA: 2:33 - loss: 2.4480 - regression_loss: 2.0756 - classification_loss: 0.3724 94/500 [====>.........................] - ETA: 2:33 - loss: 2.4522 - regression_loss: 2.0787 - classification_loss: 0.3734 95/500 [====>.........................] - ETA: 2:32 - loss: 2.4512 - regression_loss: 2.0783 - classification_loss: 0.3729 96/500 [====>.........................] - ETA: 2:32 - loss: 2.4492 - regression_loss: 2.0770 - classification_loss: 0.3722 97/500 [====>.........................] - ETA: 2:32 - loss: 2.4474 - regression_loss: 2.0756 - classification_loss: 0.3718 98/500 [====>.........................] - ETA: 2:31 - loss: 2.4504 - regression_loss: 2.0780 - classification_loss: 0.3725 99/500 [====>.........................] - ETA: 2:31 - loss: 2.4528 - regression_loss: 2.0799 - classification_loss: 0.3729 100/500 [=====>........................] - ETA: 2:30 - loss: 2.4519 - regression_loss: 2.0785 - classification_loss: 0.3735 101/500 [=====>........................] - ETA: 2:30 - loss: 2.4470 - regression_loss: 2.0741 - classification_loss: 0.3728 102/500 [=====>........................] 
- ETA: 2:30 - loss: 2.4404 - regression_loss: 2.0686 - classification_loss: 0.3719 103/500 [=====>........................] - ETA: 2:29 - loss: 2.4322 - regression_loss: 2.0617 - classification_loss: 0.3705 104/500 [=====>........................] - ETA: 2:29 - loss: 2.4331 - regression_loss: 2.0626 - classification_loss: 0.3705 105/500 [=====>........................] - ETA: 2:29 - loss: 2.4257 - regression_loss: 2.0562 - classification_loss: 0.3694 106/500 [=====>........................] - ETA: 2:28 - loss: 2.4263 - regression_loss: 2.0572 - classification_loss: 0.3691 107/500 [=====>........................] - ETA: 2:28 - loss: 2.4250 - regression_loss: 2.0552 - classification_loss: 0.3698 108/500 [=====>........................] - ETA: 2:28 - loss: 2.4276 - regression_loss: 2.0575 - classification_loss: 0.3700 109/500 [=====>........................] - ETA: 2:27 - loss: 2.4278 - regression_loss: 2.0580 - classification_loss: 0.3698 110/500 [=====>........................] - ETA: 2:27 - loss: 2.4266 - regression_loss: 2.0570 - classification_loss: 0.3696 111/500 [=====>........................] - ETA: 2:27 - loss: 2.4251 - regression_loss: 2.0559 - classification_loss: 0.3692 112/500 [=====>........................] - ETA: 2:26 - loss: 2.4234 - regression_loss: 2.0547 - classification_loss: 0.3687 113/500 [=====>........................] - ETA: 2:26 - loss: 2.4255 - regression_loss: 2.0565 - classification_loss: 0.3690 114/500 [=====>........................] - ETA: 2:25 - loss: 2.4242 - regression_loss: 2.0539 - classification_loss: 0.3703 115/500 [=====>........................] - ETA: 2:25 - loss: 2.4300 - regression_loss: 2.0590 - classification_loss: 0.3710 116/500 [=====>........................] - ETA: 2:24 - loss: 2.4306 - regression_loss: 2.0591 - classification_loss: 0.3715 117/500 [======>.......................] - ETA: 2:24 - loss: 2.4285 - regression_loss: 2.0574 - classification_loss: 0.3712 118/500 [======>.......................] 
[progress-bar updates condensed to every 50th batch; per-batch lines in between elided — total loss declines steadily from ~2.43 at batch 119 to ~2.28 at batch 450]
119/500 [======>.......................] - ETA: 2:23 - loss: 2.4294 - regression_loss: 2.0576 - classification_loss: 0.3717
150/500 [========>.....................] - ETA: 2:12 - loss: 2.4359 - regression_loss: 2.0603 - classification_loss: 0.3756
200/500 [===========>..................] - ETA: 1:53 - loss: 2.4000 - regression_loss: 2.0342 - classification_loss: 0.3658
250/500 [==============>...............] - ETA: 1:34 - loss: 2.3766 - regression_loss: 2.0123 - classification_loss: 0.3643
300/500 [=================>............] - ETA: 1:15 - loss: 2.3563 - regression_loss: 1.9961 - classification_loss: 0.3602
350/500 [====================>.........] - ETA: 56s - loss: 2.3243 - regression_loss: 1.9705 - classification_loss: 0.3537
400/500 [=======================>......] - ETA: 37s - loss: 2.3012 - regression_loss: 1.9509 - classification_loss: 0.3503
450/500 [==========================>...] - ETA: 18s - loss: 2.2797 - regression_loss: 1.9316 - classification_loss: 0.3481
454/500 [==========================>...] 
- ETA: 17s - loss: 2.2790 - regression_loss: 1.9313 - classification_loss: 0.3477 455/500 [==========================>...] - ETA: 17s - loss: 2.2801 - regression_loss: 1.9324 - classification_loss: 0.3477 456/500 [==========================>...] - ETA: 16s - loss: 2.2793 - regression_loss: 1.9317 - classification_loss: 0.3476 457/500 [==========================>...] - ETA: 16s - loss: 2.2792 - regression_loss: 1.9315 - classification_loss: 0.3477 458/500 [==========================>...] - ETA: 15s - loss: 2.2792 - regression_loss: 1.9313 - classification_loss: 0.3479 459/500 [==========================>...] - ETA: 15s - loss: 2.2787 - regression_loss: 1.9309 - classification_loss: 0.3478 460/500 [==========================>...] - ETA: 15s - loss: 2.2789 - regression_loss: 1.9310 - classification_loss: 0.3480 461/500 [==========================>...] - ETA: 14s - loss: 2.2779 - regression_loss: 1.9302 - classification_loss: 0.3478 462/500 [==========================>...] - ETA: 14s - loss: 2.2771 - regression_loss: 1.9295 - classification_loss: 0.3475 463/500 [==========================>...] - ETA: 14s - loss: 2.2764 - regression_loss: 1.9290 - classification_loss: 0.3474 464/500 [==========================>...] - ETA: 13s - loss: 2.2753 - regression_loss: 1.9281 - classification_loss: 0.3472 465/500 [==========================>...] - ETA: 13s - loss: 2.2743 - regression_loss: 1.9274 - classification_loss: 0.3470 466/500 [==========================>...] - ETA: 12s - loss: 2.2741 - regression_loss: 1.9273 - classification_loss: 0.3469 467/500 [===========================>..] - ETA: 12s - loss: 2.2738 - regression_loss: 1.9270 - classification_loss: 0.3469 468/500 [===========================>..] - ETA: 12s - loss: 2.2730 - regression_loss: 1.9263 - classification_loss: 0.3467 469/500 [===========================>..] - ETA: 11s - loss: 2.2724 - regression_loss: 1.9259 - classification_loss: 0.3465 470/500 [===========================>..] 
- ETA: 11s - loss: 2.2712 - regression_loss: 1.9250 - classification_loss: 0.3462 471/500 [===========================>..] - ETA: 11s - loss: 2.2706 - regression_loss: 1.9246 - classification_loss: 0.3460 472/500 [===========================>..] - ETA: 10s - loss: 2.2702 - regression_loss: 1.9243 - classification_loss: 0.3459 473/500 [===========================>..] - ETA: 10s - loss: 2.2699 - regression_loss: 1.9240 - classification_loss: 0.3459 474/500 [===========================>..] - ETA: 9s - loss: 2.2694 - regression_loss: 1.9236 - classification_loss: 0.3458  475/500 [===========================>..] - ETA: 9s - loss: 2.2702 - regression_loss: 1.9243 - classification_loss: 0.3459 476/500 [===========================>..] - ETA: 9s - loss: 2.2699 - regression_loss: 1.9241 - classification_loss: 0.3458 477/500 [===========================>..] - ETA: 8s - loss: 2.2727 - regression_loss: 1.9242 - classification_loss: 0.3484 478/500 [===========================>..] - ETA: 8s - loss: 2.2731 - regression_loss: 1.9247 - classification_loss: 0.3485 479/500 [===========================>..] - ETA: 7s - loss: 2.2733 - regression_loss: 1.9248 - classification_loss: 0.3485 480/500 [===========================>..] - ETA: 7s - loss: 2.2721 - regression_loss: 1.9237 - classification_loss: 0.3485 481/500 [===========================>..] - ETA: 7s - loss: 2.2718 - regression_loss: 1.9234 - classification_loss: 0.3484 482/500 [===========================>..] - ETA: 6s - loss: 2.2712 - regression_loss: 1.9229 - classification_loss: 0.3482 483/500 [===========================>..] - ETA: 6s - loss: 2.2713 - regression_loss: 1.9230 - classification_loss: 0.3483 484/500 [============================>.] - ETA: 6s - loss: 2.2721 - regression_loss: 1.9236 - classification_loss: 0.3485 485/500 [============================>.] - ETA: 5s - loss: 2.2706 - regression_loss: 1.9223 - classification_loss: 0.3483 486/500 [============================>.] 
- ETA: 5s - loss: 2.2697 - regression_loss: 1.9216 - classification_loss: 0.3481
...
500/500 [==============================] - 190s 380ms/step - loss: 2.2635 - regression_loss: 1.9167 - classification_loss: 0.3467
1172 instances of class plum with average precision: 0.4812
mAP: 0.4812
Epoch 00002: saving model to ./training/snapshots/resnet101_pascal_02.h5
Epoch 3/20
1/500 [..............................]
- ETA: 3:11 - loss: 1.8510 - regression_loss: 1.6263 - classification_loss: 0.2247
...
224/500 [============>.................] - ETA: 1:45 - loss: 1.9812 - regression_loss: 1.6889 - classification_loss: 0.2923
225/500 [============>.................]
- ETA: 1:45 - loss: 1.9800 - regression_loss: 1.6881 - classification_loss: 0.2919 226/500 [============>.................] - ETA: 1:45 - loss: 1.9796 - regression_loss: 1.6879 - classification_loss: 0.2917 227/500 [============>.................] - ETA: 1:44 - loss: 1.9808 - regression_loss: 1.6890 - classification_loss: 0.2918 228/500 [============>.................] - ETA: 1:44 - loss: 1.9807 - regression_loss: 1.6891 - classification_loss: 0.2916 229/500 [============>.................] - ETA: 1:43 - loss: 1.9792 - regression_loss: 1.6880 - classification_loss: 0.2912 230/500 [============>.................] - ETA: 1:43 - loss: 1.9792 - regression_loss: 1.6881 - classification_loss: 0.2912 231/500 [============>.................] - ETA: 1:43 - loss: 1.9791 - regression_loss: 1.6878 - classification_loss: 0.2913 232/500 [============>.................] - ETA: 1:42 - loss: 1.9799 - regression_loss: 1.6886 - classification_loss: 0.2914 233/500 [============>.................] - ETA: 1:42 - loss: 1.9775 - regression_loss: 1.6866 - classification_loss: 0.2909 234/500 [=============>................] - ETA: 1:41 - loss: 1.9780 - regression_loss: 1.6872 - classification_loss: 0.2908 235/500 [=============>................] - ETA: 1:41 - loss: 1.9789 - regression_loss: 1.6881 - classification_loss: 0.2908 236/500 [=============>................] - ETA: 1:41 - loss: 1.9785 - regression_loss: 1.6879 - classification_loss: 0.2907 237/500 [=============>................] - ETA: 1:40 - loss: 1.9734 - regression_loss: 1.6835 - classification_loss: 0.2899 238/500 [=============>................] - ETA: 1:40 - loss: 1.9722 - regression_loss: 1.6823 - classification_loss: 0.2900 239/500 [=============>................] - ETA: 1:39 - loss: 1.9726 - regression_loss: 1.6826 - classification_loss: 0.2900 240/500 [=============>................] - ETA: 1:39 - loss: 1.9714 - regression_loss: 1.6815 - classification_loss: 0.2899 241/500 [=============>................] 
- ETA: 1:39 - loss: 1.9723 - regression_loss: 1.6823 - classification_loss: 0.2900 242/500 [=============>................] - ETA: 1:38 - loss: 1.9739 - regression_loss: 1.6838 - classification_loss: 0.2901 243/500 [=============>................] - ETA: 1:38 - loss: 1.9752 - regression_loss: 1.6850 - classification_loss: 0.2903 244/500 [=============>................] - ETA: 1:37 - loss: 1.9726 - regression_loss: 1.6825 - classification_loss: 0.2900 245/500 [=============>................] - ETA: 1:37 - loss: 1.9725 - regression_loss: 1.6825 - classification_loss: 0.2900 246/500 [=============>................] - ETA: 1:37 - loss: 1.9737 - regression_loss: 1.6835 - classification_loss: 0.2901 247/500 [=============>................] - ETA: 1:36 - loss: 1.9726 - regression_loss: 1.6824 - classification_loss: 0.2902 248/500 [=============>................] - ETA: 1:36 - loss: 1.9718 - regression_loss: 1.6815 - classification_loss: 0.2903 249/500 [=============>................] - ETA: 1:35 - loss: 1.9691 - regression_loss: 1.6793 - classification_loss: 0.2899 250/500 [==============>...............] - ETA: 1:35 - loss: 1.9687 - regression_loss: 1.6789 - classification_loss: 0.2898 251/500 [==============>...............] - ETA: 1:35 - loss: 1.9689 - regression_loss: 1.6789 - classification_loss: 0.2900 252/500 [==============>...............] - ETA: 1:34 - loss: 1.9681 - regression_loss: 1.6782 - classification_loss: 0.2899 253/500 [==============>...............] - ETA: 1:34 - loss: 1.9689 - regression_loss: 1.6790 - classification_loss: 0.2899 254/500 [==============>...............] - ETA: 1:34 - loss: 1.9694 - regression_loss: 1.6796 - classification_loss: 0.2898 255/500 [==============>...............] - ETA: 1:33 - loss: 1.9674 - regression_loss: 1.6780 - classification_loss: 0.2894 256/500 [==============>...............] - ETA: 1:33 - loss: 1.9671 - regression_loss: 1.6777 - classification_loss: 0.2894 257/500 [==============>...............] 
- ETA: 1:32 - loss: 1.9655 - regression_loss: 1.6764 - classification_loss: 0.2891 258/500 [==============>...............] - ETA: 1:32 - loss: 1.9663 - regression_loss: 1.6771 - classification_loss: 0.2892 259/500 [==============>...............] - ETA: 1:32 - loss: 1.9663 - regression_loss: 1.6768 - classification_loss: 0.2894 260/500 [==============>...............] - ETA: 1:31 - loss: 1.9664 - regression_loss: 1.6769 - classification_loss: 0.2895 261/500 [==============>...............] - ETA: 1:31 - loss: 1.9665 - regression_loss: 1.6771 - classification_loss: 0.2894 262/500 [==============>...............] - ETA: 1:30 - loss: 1.9663 - regression_loss: 1.6769 - classification_loss: 0.2894 263/500 [==============>...............] - ETA: 1:30 - loss: 1.9644 - regression_loss: 1.6750 - classification_loss: 0.2894 264/500 [==============>...............] - ETA: 1:30 - loss: 1.9671 - regression_loss: 1.6773 - classification_loss: 0.2898 265/500 [==============>...............] - ETA: 1:29 - loss: 1.9661 - regression_loss: 1.6760 - classification_loss: 0.2901 266/500 [==============>...............] - ETA: 1:29 - loss: 1.9659 - regression_loss: 1.6759 - classification_loss: 0.2900 267/500 [===============>..............] - ETA: 1:28 - loss: 1.9636 - regression_loss: 1.6741 - classification_loss: 0.2896 268/500 [===============>..............] - ETA: 1:28 - loss: 1.9637 - regression_loss: 1.6740 - classification_loss: 0.2896 269/500 [===============>..............] - ETA: 1:28 - loss: 1.9593 - regression_loss: 1.6704 - classification_loss: 0.2890 270/500 [===============>..............] - ETA: 1:27 - loss: 1.9554 - regression_loss: 1.6668 - classification_loss: 0.2886 271/500 [===============>..............] - ETA: 1:27 - loss: 1.9549 - regression_loss: 1.6664 - classification_loss: 0.2885 272/500 [===============>..............] - ETA: 1:27 - loss: 1.9547 - regression_loss: 1.6663 - classification_loss: 0.2883 273/500 [===============>..............] 
- ETA: 1:26 - loss: 1.9522 - regression_loss: 1.6641 - classification_loss: 0.2880 274/500 [===============>..............] - ETA: 1:26 - loss: 1.9520 - regression_loss: 1.6638 - classification_loss: 0.2882 275/500 [===============>..............] - ETA: 1:25 - loss: 1.9516 - regression_loss: 1.6635 - classification_loss: 0.2880 276/500 [===============>..............] - ETA: 1:25 - loss: 1.9495 - regression_loss: 1.6617 - classification_loss: 0.2878 277/500 [===============>..............] - ETA: 1:25 - loss: 1.9497 - regression_loss: 1.6620 - classification_loss: 0.2878 278/500 [===============>..............] - ETA: 1:24 - loss: 1.9470 - regression_loss: 1.6597 - classification_loss: 0.2873 279/500 [===============>..............] - ETA: 1:24 - loss: 1.9457 - regression_loss: 1.6586 - classification_loss: 0.2871 280/500 [===============>..............] - ETA: 1:23 - loss: 1.9461 - regression_loss: 1.6590 - classification_loss: 0.2871 281/500 [===============>..............] - ETA: 1:23 - loss: 1.9430 - regression_loss: 1.6562 - classification_loss: 0.2868 282/500 [===============>..............] - ETA: 1:23 - loss: 1.9411 - regression_loss: 1.6546 - classification_loss: 0.2865 283/500 [===============>..............] - ETA: 1:22 - loss: 1.9402 - regression_loss: 1.6539 - classification_loss: 0.2863 284/500 [================>.............] - ETA: 1:22 - loss: 1.9406 - regression_loss: 1.6543 - classification_loss: 0.2863 285/500 [================>.............] - ETA: 1:22 - loss: 1.9406 - regression_loss: 1.6543 - classification_loss: 0.2862 286/500 [================>.............] - ETA: 1:21 - loss: 1.9385 - regression_loss: 1.6524 - classification_loss: 0.2862 287/500 [================>.............] - ETA: 1:21 - loss: 1.9395 - regression_loss: 1.6533 - classification_loss: 0.2862 288/500 [================>.............] - ETA: 1:20 - loss: 1.9406 - regression_loss: 1.6541 - classification_loss: 0.2866 289/500 [================>.............] 
- ETA: 1:20 - loss: 1.9395 - regression_loss: 1.6531 - classification_loss: 0.2864 290/500 [================>.............] - ETA: 1:20 - loss: 1.9394 - regression_loss: 1.6530 - classification_loss: 0.2864 291/500 [================>.............] - ETA: 1:19 - loss: 1.9362 - regression_loss: 1.6501 - classification_loss: 0.2862 292/500 [================>.............] - ETA: 1:19 - loss: 1.9344 - regression_loss: 1.6486 - classification_loss: 0.2858 293/500 [================>.............] - ETA: 1:19 - loss: 1.9338 - regression_loss: 1.6481 - classification_loss: 0.2857 294/500 [================>.............] - ETA: 1:18 - loss: 1.9321 - regression_loss: 1.6466 - classification_loss: 0.2855 295/500 [================>.............] - ETA: 1:18 - loss: 1.9314 - regression_loss: 1.6460 - classification_loss: 0.2854 296/500 [================>.............] - ETA: 1:17 - loss: 1.9322 - regression_loss: 1.6469 - classification_loss: 0.2853 297/500 [================>.............] - ETA: 1:17 - loss: 1.9326 - regression_loss: 1.6474 - classification_loss: 0.2852 298/500 [================>.............] - ETA: 1:17 - loss: 1.9326 - regression_loss: 1.6474 - classification_loss: 0.2852 299/500 [================>.............] - ETA: 1:16 - loss: 1.9328 - regression_loss: 1.6477 - classification_loss: 0.2852 300/500 [=================>............] - ETA: 1:16 - loss: 1.9314 - regression_loss: 1.6466 - classification_loss: 0.2848 301/500 [=================>............] - ETA: 1:15 - loss: 1.9313 - regression_loss: 1.6464 - classification_loss: 0.2849 302/500 [=================>............] - ETA: 1:15 - loss: 1.9287 - regression_loss: 1.6440 - classification_loss: 0.2848 303/500 [=================>............] - ETA: 1:15 - loss: 1.9264 - regression_loss: 1.6421 - classification_loss: 0.2843 304/500 [=================>............] - ETA: 1:14 - loss: 1.9283 - regression_loss: 1.6437 - classification_loss: 0.2846 305/500 [=================>............] 
- ETA: 1:14 - loss: 1.9293 - regression_loss: 1.6446 - classification_loss: 0.2847 306/500 [=================>............] - ETA: 1:14 - loss: 1.9291 - regression_loss: 1.6445 - classification_loss: 0.2847 307/500 [=================>............] - ETA: 1:13 - loss: 1.9294 - regression_loss: 1.6444 - classification_loss: 0.2849 308/500 [=================>............] - ETA: 1:13 - loss: 1.9267 - regression_loss: 1.6423 - classification_loss: 0.2845 309/500 [=================>............] - ETA: 1:12 - loss: 1.9262 - regression_loss: 1.6418 - classification_loss: 0.2844 310/500 [=================>............] - ETA: 1:12 - loss: 1.9244 - regression_loss: 1.6405 - classification_loss: 0.2839 311/500 [=================>............] - ETA: 1:12 - loss: 1.9240 - regression_loss: 1.6402 - classification_loss: 0.2838 312/500 [=================>............] - ETA: 1:11 - loss: 1.9236 - regression_loss: 1.6399 - classification_loss: 0.2836 313/500 [=================>............] - ETA: 1:11 - loss: 1.9229 - regression_loss: 1.6396 - classification_loss: 0.2834 314/500 [=================>............] - ETA: 1:11 - loss: 1.9221 - regression_loss: 1.6389 - classification_loss: 0.2832 315/500 [=================>............] - ETA: 1:10 - loss: 1.9226 - regression_loss: 1.6395 - classification_loss: 0.2831 316/500 [=================>............] - ETA: 1:10 - loss: 1.9225 - regression_loss: 1.6395 - classification_loss: 0.2830 317/500 [==================>...........] - ETA: 1:09 - loss: 1.9227 - regression_loss: 1.6399 - classification_loss: 0.2828 318/500 [==================>...........] - ETA: 1:09 - loss: 1.9231 - regression_loss: 1.6405 - classification_loss: 0.2826 319/500 [==================>...........] - ETA: 1:09 - loss: 1.9230 - regression_loss: 1.6404 - classification_loss: 0.2826 320/500 [==================>...........] - ETA: 1:08 - loss: 1.9229 - regression_loss: 1.6404 - classification_loss: 0.2825 321/500 [==================>...........] 
- ETA: 1:08 - loss: 1.9214 - regression_loss: 1.6392 - classification_loss: 0.2821 322/500 [==================>...........] - ETA: 1:07 - loss: 1.9220 - regression_loss: 1.6399 - classification_loss: 0.2821 323/500 [==================>...........] - ETA: 1:07 - loss: 1.9221 - regression_loss: 1.6399 - classification_loss: 0.2821 324/500 [==================>...........] - ETA: 1:07 - loss: 1.9219 - regression_loss: 1.6398 - classification_loss: 0.2821 325/500 [==================>...........] - ETA: 1:06 - loss: 1.9223 - regression_loss: 1.6402 - classification_loss: 0.2821 326/500 [==================>...........] - ETA: 1:06 - loss: 1.9227 - regression_loss: 1.6404 - classification_loss: 0.2822 327/500 [==================>...........] - ETA: 1:06 - loss: 1.9222 - regression_loss: 1.6401 - classification_loss: 0.2821 328/500 [==================>...........] - ETA: 1:05 - loss: 1.9236 - regression_loss: 1.6414 - classification_loss: 0.2823 329/500 [==================>...........] - ETA: 1:05 - loss: 1.9241 - regression_loss: 1.6418 - classification_loss: 0.2823 330/500 [==================>...........] - ETA: 1:04 - loss: 1.9251 - regression_loss: 1.6426 - classification_loss: 0.2825 331/500 [==================>...........] - ETA: 1:04 - loss: 1.9254 - regression_loss: 1.6429 - classification_loss: 0.2824 332/500 [==================>...........] - ETA: 1:04 - loss: 1.9258 - regression_loss: 1.6433 - classification_loss: 0.2825 333/500 [==================>...........] - ETA: 1:03 - loss: 1.9268 - regression_loss: 1.6442 - classification_loss: 0.2826 334/500 [===================>..........] - ETA: 1:03 - loss: 1.9271 - regression_loss: 1.6445 - classification_loss: 0.2826 335/500 [===================>..........] - ETA: 1:02 - loss: 1.9283 - regression_loss: 1.6455 - classification_loss: 0.2828 336/500 [===================>..........] - ETA: 1:02 - loss: 1.9270 - regression_loss: 1.6445 - classification_loss: 0.2825 337/500 [===================>..........] 
- ETA: 1:02 - loss: 1.9269 - regression_loss: 1.6444 - classification_loss: 0.2825 338/500 [===================>..........] - ETA: 1:01 - loss: 1.9284 - regression_loss: 1.6457 - classification_loss: 0.2827 339/500 [===================>..........] - ETA: 1:01 - loss: 1.9272 - regression_loss: 1.6446 - classification_loss: 0.2825 340/500 [===================>..........] - ETA: 1:01 - loss: 1.9249 - regression_loss: 1.6422 - classification_loss: 0.2828 341/500 [===================>..........] - ETA: 1:00 - loss: 1.9249 - regression_loss: 1.6421 - classification_loss: 0.2828 342/500 [===================>..........] - ETA: 1:00 - loss: 1.9258 - regression_loss: 1.6429 - classification_loss: 0.2828 343/500 [===================>..........] - ETA: 59s - loss: 1.9245 - regression_loss: 1.6418 - classification_loss: 0.2827  344/500 [===================>..........] - ETA: 59s - loss: 1.9239 - regression_loss: 1.6414 - classification_loss: 0.2825 345/500 [===================>..........] - ETA: 59s - loss: 1.9234 - regression_loss: 1.6409 - classification_loss: 0.2825 346/500 [===================>..........] - ETA: 58s - loss: 1.9243 - regression_loss: 1.6416 - classification_loss: 0.2828 347/500 [===================>..........] - ETA: 58s - loss: 1.9246 - regression_loss: 1.6419 - classification_loss: 0.2828 348/500 [===================>..........] - ETA: 58s - loss: 1.9255 - regression_loss: 1.6426 - classification_loss: 0.2828 349/500 [===================>..........] - ETA: 57s - loss: 1.9277 - regression_loss: 1.6445 - classification_loss: 0.2831 350/500 [====================>.........] - ETA: 57s - loss: 1.9277 - regression_loss: 1.6446 - classification_loss: 0.2831 351/500 [====================>.........] - ETA: 56s - loss: 1.9276 - regression_loss: 1.6446 - classification_loss: 0.2829 352/500 [====================>.........] - ETA: 56s - loss: 1.9284 - regression_loss: 1.6452 - classification_loss: 0.2833 353/500 [====================>.........] 
- ETA: 56s - loss: 1.9288 - regression_loss: 1.6453 - classification_loss: 0.2835 354/500 [====================>.........] - ETA: 55s - loss: 1.9289 - regression_loss: 1.6454 - classification_loss: 0.2834 355/500 [====================>.........] - ETA: 55s - loss: 1.9287 - regression_loss: 1.6453 - classification_loss: 0.2834 356/500 [====================>.........] - ETA: 54s - loss: 1.9259 - regression_loss: 1.6429 - classification_loss: 0.2830 357/500 [====================>.........] - ETA: 54s - loss: 1.9263 - regression_loss: 1.6433 - classification_loss: 0.2831 358/500 [====================>.........] - ETA: 54s - loss: 1.9257 - regression_loss: 1.6429 - classification_loss: 0.2828 359/500 [====================>.........] - ETA: 53s - loss: 1.9262 - regression_loss: 1.6432 - classification_loss: 0.2829 360/500 [====================>.........] - ETA: 53s - loss: 1.9265 - regression_loss: 1.6436 - classification_loss: 0.2829 361/500 [====================>.........] - ETA: 53s - loss: 1.9274 - regression_loss: 1.6442 - classification_loss: 0.2832 362/500 [====================>.........] - ETA: 52s - loss: 1.9268 - regression_loss: 1.6439 - classification_loss: 0.2829 363/500 [====================>.........] - ETA: 52s - loss: 1.9263 - regression_loss: 1.6435 - classification_loss: 0.2827 364/500 [====================>.........] - ETA: 51s - loss: 1.9262 - regression_loss: 1.6433 - classification_loss: 0.2829 365/500 [====================>.........] - ETA: 51s - loss: 1.9259 - regression_loss: 1.6432 - classification_loss: 0.2827 366/500 [====================>.........] - ETA: 51s - loss: 1.9264 - regression_loss: 1.6437 - classification_loss: 0.2827 367/500 [=====================>........] - ETA: 50s - loss: 1.9286 - regression_loss: 1.6455 - classification_loss: 0.2831 368/500 [=====================>........] - ETA: 50s - loss: 1.9283 - regression_loss: 1.6452 - classification_loss: 0.2830 369/500 [=====================>........] 
- ETA: 49s - loss: 1.9280 - regression_loss: 1.6450 - classification_loss: 0.2830 370/500 [=====================>........] - ETA: 49s - loss: 1.9293 - regression_loss: 1.6464 - classification_loss: 0.2829 371/500 [=====================>........] - ETA: 49s - loss: 1.9300 - regression_loss: 1.6471 - classification_loss: 0.2829 372/500 [=====================>........] - ETA: 48s - loss: 1.9295 - regression_loss: 1.6467 - classification_loss: 0.2829 373/500 [=====================>........] - ETA: 48s - loss: 1.9288 - regression_loss: 1.6461 - classification_loss: 0.2828 374/500 [=====================>........] - ETA: 48s - loss: 1.9283 - regression_loss: 1.6455 - classification_loss: 0.2828 375/500 [=====================>........] - ETA: 47s - loss: 1.9281 - regression_loss: 1.6455 - classification_loss: 0.2827 376/500 [=====================>........] - ETA: 47s - loss: 1.9284 - regression_loss: 1.6458 - classification_loss: 0.2826 377/500 [=====================>........] - ETA: 46s - loss: 1.9298 - regression_loss: 1.6470 - classification_loss: 0.2828 378/500 [=====================>........] - ETA: 46s - loss: 1.9303 - regression_loss: 1.6475 - classification_loss: 0.2828 379/500 [=====================>........] - ETA: 46s - loss: 1.9287 - regression_loss: 1.6462 - classification_loss: 0.2825 380/500 [=====================>........] - ETA: 45s - loss: 1.9266 - regression_loss: 1.6444 - classification_loss: 0.2822 381/500 [=====================>........] - ETA: 45s - loss: 1.9267 - regression_loss: 1.6445 - classification_loss: 0.2822 382/500 [=====================>........] - ETA: 45s - loss: 1.9274 - regression_loss: 1.6451 - classification_loss: 0.2823 383/500 [=====================>........] - ETA: 44s - loss: 1.9284 - regression_loss: 1.6460 - classification_loss: 0.2824 384/500 [======================>.......] - ETA: 44s - loss: 1.9278 - regression_loss: 1.6455 - classification_loss: 0.2823 385/500 [======================>.......] 
- ETA: 43s - loss: 1.9277 - regression_loss: 1.6455 - classification_loss: 0.2822 386/500 [======================>.......] - ETA: 43s - loss: 1.9277 - regression_loss: 1.6456 - classification_loss: 0.2821 387/500 [======================>.......] - ETA: 43s - loss: 1.9278 - regression_loss: 1.6457 - classification_loss: 0.2821 388/500 [======================>.......] - ETA: 42s - loss: 1.9277 - regression_loss: 1.6455 - classification_loss: 0.2822 389/500 [======================>.......] - ETA: 42s - loss: 1.9263 - regression_loss: 1.6441 - classification_loss: 0.2822 390/500 [======================>.......] - ETA: 41s - loss: 1.9252 - regression_loss: 1.6432 - classification_loss: 0.2820 391/500 [======================>.......] - ETA: 41s - loss: 1.9252 - regression_loss: 1.6433 - classification_loss: 0.2819 392/500 [======================>.......] - ETA: 41s - loss: 1.9256 - regression_loss: 1.6436 - classification_loss: 0.2820 393/500 [======================>.......] - ETA: 40s - loss: 1.9258 - regression_loss: 1.6437 - classification_loss: 0.2820 394/500 [======================>.......] - ETA: 40s - loss: 1.9240 - regression_loss: 1.6423 - classification_loss: 0.2817 395/500 [======================>.......] - ETA: 40s - loss: 1.9238 - regression_loss: 1.6421 - classification_loss: 0.2817 396/500 [======================>.......] - ETA: 39s - loss: 1.9241 - regression_loss: 1.6425 - classification_loss: 0.2816 397/500 [======================>.......] - ETA: 39s - loss: 1.9244 - regression_loss: 1.6427 - classification_loss: 0.2817 398/500 [======================>.......] - ETA: 38s - loss: 1.9238 - regression_loss: 1.6425 - classification_loss: 0.2813 399/500 [======================>.......] - ETA: 38s - loss: 1.9214 - regression_loss: 1.6405 - classification_loss: 0.2810 400/500 [=======================>......] - ETA: 38s - loss: 1.9218 - regression_loss: 1.6408 - classification_loss: 0.2809 401/500 [=======================>......] 
- ETA: 37s - loss: 1.9224 - regression_loss: 1.6415 - classification_loss: 0.2809 402/500 [=======================>......] - ETA: 37s - loss: 1.9224 - regression_loss: 1.6415 - classification_loss: 0.2809 403/500 [=======================>......] - ETA: 36s - loss: 1.9231 - regression_loss: 1.6420 - classification_loss: 0.2811 404/500 [=======================>......] - ETA: 36s - loss: 1.9219 - regression_loss: 1.6409 - classification_loss: 0.2810 405/500 [=======================>......] - ETA: 36s - loss: 1.9225 - regression_loss: 1.6414 - classification_loss: 0.2810 406/500 [=======================>......] - ETA: 35s - loss: 1.9221 - regression_loss: 1.6411 - classification_loss: 0.2810 407/500 [=======================>......] - ETA: 35s - loss: 1.9214 - regression_loss: 1.6404 - classification_loss: 0.2810 408/500 [=======================>......] - ETA: 35s - loss: 1.9212 - regression_loss: 1.6404 - classification_loss: 0.2809 409/500 [=======================>......] - ETA: 34s - loss: 1.9216 - regression_loss: 1.6408 - classification_loss: 0.2808 410/500 [=======================>......] - ETA: 34s - loss: 1.9219 - regression_loss: 1.6412 - classification_loss: 0.2807 411/500 [=======================>......] - ETA: 33s - loss: 1.9206 - regression_loss: 1.6402 - classification_loss: 0.2804 412/500 [=======================>......] - ETA: 33s - loss: 1.9194 - regression_loss: 1.6391 - classification_loss: 0.2803 413/500 [=======================>......] - ETA: 33s - loss: 1.9194 - regression_loss: 1.6390 - classification_loss: 0.2804 414/500 [=======================>......] - ETA: 32s - loss: 1.9170 - regression_loss: 1.6370 - classification_loss: 0.2800 415/500 [=======================>......] - ETA: 32s - loss: 1.9168 - regression_loss: 1.6369 - classification_loss: 0.2799 416/500 [=======================>......] - ETA: 32s - loss: 1.9171 - regression_loss: 1.6372 - classification_loss: 0.2799 417/500 [========================>.....] 
- ETA: 31s - loss: 1.9170 - regression_loss: 1.6372 - classification_loss: 0.2797 418/500 [========================>.....] - ETA: 31s - loss: 1.9157 - regression_loss: 1.6362 - classification_loss: 0.2795 419/500 [========================>.....] - ETA: 30s - loss: 1.9167 - regression_loss: 1.6371 - classification_loss: 0.2795 420/500 [========================>.....] - ETA: 30s - loss: 1.9165 - regression_loss: 1.6371 - classification_loss: 0.2795 421/500 [========================>.....] - ETA: 30s - loss: 1.9159 - regression_loss: 1.6365 - classification_loss: 0.2794 422/500 [========================>.....] - ETA: 29s - loss: 1.9166 - regression_loss: 1.6372 - classification_loss: 0.2795 423/500 [========================>.....] - ETA: 29s - loss: 1.9165 - regression_loss: 1.6370 - classification_loss: 0.2795 424/500 [========================>.....] - ETA: 28s - loss: 1.9160 - regression_loss: 1.6367 - classification_loss: 0.2794 425/500 [========================>.....] - ETA: 28s - loss: 1.9158 - regression_loss: 1.6364 - classification_loss: 0.2794 426/500 [========================>.....] - ETA: 28s - loss: 1.9154 - regression_loss: 1.6361 - classification_loss: 0.2793 427/500 [========================>.....] - ETA: 27s - loss: 1.9151 - regression_loss: 1.6359 - classification_loss: 0.2792 428/500 [========================>.....] - ETA: 27s - loss: 1.9148 - regression_loss: 1.6357 - classification_loss: 0.2791 429/500 [========================>.....] - ETA: 27s - loss: 1.9143 - regression_loss: 1.6353 - classification_loss: 0.2790 430/500 [========================>.....] - ETA: 26s - loss: 1.9145 - regression_loss: 1.6355 - classification_loss: 0.2790 431/500 [========================>.....] - ETA: 26s - loss: 1.9135 - regression_loss: 1.6347 - classification_loss: 0.2787 432/500 [========================>.....] - ETA: 25s - loss: 1.9136 - regression_loss: 1.6346 - classification_loss: 0.2790 433/500 [========================>.....] 
- ETA: 25s - loss: 1.9134 - regression_loss: 1.6345 - classification_loss: 0.2789
[epoch 3 progress updates for steps 434–499/500 omitted; loss drifted from 1.9135 down to 1.8863]
500/500 [==============================] - 191s 381ms/step - loss: 1.8874 - regression_loss: 1.6123 - classification_loss: 0.2752
1172 instances of class plum with average precision: 0.5340
mAP: 0.5340
Epoch 00003: saving model to ./training/snapshots/resnet101_pascal_03.h5
Epoch 4/20
[epoch 4 progress updates for steps 1–267/500 omitted; loss peaked near 2.05 in the first few steps, then settled around 1.76–1.80 (regression_loss ≈ 1.51–1.55, classification_loss ≈ 0.25–0.26)]
268/500 [===============>..............]
- ETA: 1:28 - loss: 1.7646 - regression_loss: 1.5147 - classification_loss: 0.2499 269/500 [===============>..............] - ETA: 1:28 - loss: 1.7640 - regression_loss: 1.5143 - classification_loss: 0.2497 270/500 [===============>..............] - ETA: 1:27 - loss: 1.7614 - regression_loss: 1.5121 - classification_loss: 0.2493 271/500 [===============>..............] - ETA: 1:27 - loss: 1.7617 - regression_loss: 1.5124 - classification_loss: 0.2494 272/500 [===============>..............] - ETA: 1:27 - loss: 1.7640 - regression_loss: 1.5133 - classification_loss: 0.2507 273/500 [===============>..............] - ETA: 1:26 - loss: 1.7641 - regression_loss: 1.5134 - classification_loss: 0.2507 274/500 [===============>..............] - ETA: 1:26 - loss: 1.7650 - regression_loss: 1.5142 - classification_loss: 0.2508 275/500 [===============>..............] - ETA: 1:25 - loss: 1.7656 - regression_loss: 1.5148 - classification_loss: 0.2508 276/500 [===============>..............] - ETA: 1:25 - loss: 1.7628 - regression_loss: 1.5122 - classification_loss: 0.2506 277/500 [===============>..............] - ETA: 1:25 - loss: 1.7642 - regression_loss: 1.5134 - classification_loss: 0.2508 278/500 [===============>..............] - ETA: 1:24 - loss: 1.7611 - regression_loss: 1.5108 - classification_loss: 0.2504 279/500 [===============>..............] - ETA: 1:24 - loss: 1.7612 - regression_loss: 1.5110 - classification_loss: 0.2502 280/500 [===============>..............] - ETA: 1:24 - loss: 1.7605 - regression_loss: 1.5101 - classification_loss: 0.2503 281/500 [===============>..............] - ETA: 1:23 - loss: 1.7588 - regression_loss: 1.5086 - classification_loss: 0.2502 282/500 [===============>..............] - ETA: 1:23 - loss: 1.7578 - regression_loss: 1.5077 - classification_loss: 0.2501 283/500 [===============>..............] - ETA: 1:23 - loss: 1.7594 - regression_loss: 1.5092 - classification_loss: 0.2502 284/500 [================>.............] 
- ETA: 1:22 - loss: 1.7583 - regression_loss: 1.5084 - classification_loss: 0.2499 285/500 [================>.............] - ETA: 1:22 - loss: 1.7595 - regression_loss: 1.5095 - classification_loss: 0.2500 286/500 [================>.............] - ETA: 1:21 - loss: 1.7570 - regression_loss: 1.5073 - classification_loss: 0.2497 287/500 [================>.............] - ETA: 1:21 - loss: 1.7535 - regression_loss: 1.5043 - classification_loss: 0.2492 288/500 [================>.............] - ETA: 1:21 - loss: 1.7531 - regression_loss: 1.5041 - classification_loss: 0.2490 289/500 [================>.............] - ETA: 1:20 - loss: 1.7556 - regression_loss: 1.5063 - classification_loss: 0.2493 290/500 [================>.............] - ETA: 1:20 - loss: 1.7525 - regression_loss: 1.5036 - classification_loss: 0.2489 291/500 [================>.............] - ETA: 1:20 - loss: 1.7484 - regression_loss: 1.5000 - classification_loss: 0.2484 292/500 [================>.............] - ETA: 1:19 - loss: 1.7483 - regression_loss: 1.5000 - classification_loss: 0.2484 293/500 [================>.............] - ETA: 1:19 - loss: 1.7482 - regression_loss: 1.4999 - classification_loss: 0.2483 294/500 [================>.............] - ETA: 1:18 - loss: 1.7495 - regression_loss: 1.5010 - classification_loss: 0.2485 295/500 [================>.............] - ETA: 1:18 - loss: 1.7499 - regression_loss: 1.5013 - classification_loss: 0.2486 296/500 [================>.............] - ETA: 1:18 - loss: 1.7488 - regression_loss: 1.5004 - classification_loss: 0.2484 297/500 [================>.............] - ETA: 1:17 - loss: 1.7486 - regression_loss: 1.5005 - classification_loss: 0.2481 298/500 [================>.............] - ETA: 1:17 - loss: 1.7477 - regression_loss: 1.4996 - classification_loss: 0.2481 299/500 [================>.............] - ETA: 1:16 - loss: 1.7483 - regression_loss: 1.5000 - classification_loss: 0.2482 300/500 [=================>............] 
- ETA: 1:16 - loss: 1.7488 - regression_loss: 1.5006 - classification_loss: 0.2481 301/500 [=================>............] - ETA: 1:16 - loss: 1.7484 - regression_loss: 1.5002 - classification_loss: 0.2482 302/500 [=================>............] - ETA: 1:15 - loss: 1.7493 - regression_loss: 1.5007 - classification_loss: 0.2486 303/500 [=================>............] - ETA: 1:15 - loss: 1.7488 - regression_loss: 1.5004 - classification_loss: 0.2485 304/500 [=================>............] - ETA: 1:15 - loss: 1.7483 - regression_loss: 1.4999 - classification_loss: 0.2484 305/500 [=================>............] - ETA: 1:14 - loss: 1.7490 - regression_loss: 1.5002 - classification_loss: 0.2488 306/500 [=================>............] - ETA: 1:14 - loss: 1.7498 - regression_loss: 1.5009 - classification_loss: 0.2489 307/500 [=================>............] - ETA: 1:13 - loss: 1.7503 - regression_loss: 1.5014 - classification_loss: 0.2489 308/500 [=================>............] - ETA: 1:13 - loss: 1.7474 - regression_loss: 1.4987 - classification_loss: 0.2487 309/500 [=================>............] - ETA: 1:13 - loss: 1.7482 - regression_loss: 1.4995 - classification_loss: 0.2487 310/500 [=================>............] - ETA: 1:12 - loss: 1.7485 - regression_loss: 1.4996 - classification_loss: 0.2488 311/500 [=================>............] - ETA: 1:12 - loss: 1.7491 - regression_loss: 1.5002 - classification_loss: 0.2490 312/500 [=================>............] - ETA: 1:12 - loss: 1.7488 - regression_loss: 1.4996 - classification_loss: 0.2491 313/500 [=================>............] - ETA: 1:11 - loss: 1.7490 - regression_loss: 1.4997 - classification_loss: 0.2494 314/500 [=================>............] - ETA: 1:11 - loss: 1.7470 - regression_loss: 1.4980 - classification_loss: 0.2490 315/500 [=================>............] - ETA: 1:10 - loss: 1.7467 - regression_loss: 1.4976 - classification_loss: 0.2492 316/500 [=================>............] 
- ETA: 1:10 - loss: 1.7445 - regression_loss: 1.4957 - classification_loss: 0.2488 317/500 [==================>...........] - ETA: 1:10 - loss: 1.7449 - regression_loss: 1.4960 - classification_loss: 0.2489 318/500 [==================>...........] - ETA: 1:09 - loss: 1.7439 - regression_loss: 1.4952 - classification_loss: 0.2487 319/500 [==================>...........] - ETA: 1:09 - loss: 1.7422 - regression_loss: 1.4937 - classification_loss: 0.2485 320/500 [==================>...........] - ETA: 1:09 - loss: 1.7407 - regression_loss: 1.4925 - classification_loss: 0.2482 321/500 [==================>...........] - ETA: 1:08 - loss: 1.7405 - regression_loss: 1.4922 - classification_loss: 0.2483 322/500 [==================>...........] - ETA: 1:08 - loss: 1.7420 - regression_loss: 1.4936 - classification_loss: 0.2484 323/500 [==================>...........] - ETA: 1:07 - loss: 1.7425 - regression_loss: 1.4941 - classification_loss: 0.2483 324/500 [==================>...........] - ETA: 1:07 - loss: 1.7428 - regression_loss: 1.4945 - classification_loss: 0.2483 325/500 [==================>...........] - ETA: 1:07 - loss: 1.7427 - regression_loss: 1.4945 - classification_loss: 0.2482 326/500 [==================>...........] - ETA: 1:06 - loss: 1.7421 - regression_loss: 1.4941 - classification_loss: 0.2480 327/500 [==================>...........] - ETA: 1:06 - loss: 1.7427 - regression_loss: 1.4947 - classification_loss: 0.2480 328/500 [==================>...........] - ETA: 1:05 - loss: 1.7444 - regression_loss: 1.4959 - classification_loss: 0.2484 329/500 [==================>...........] - ETA: 1:05 - loss: 1.7437 - regression_loss: 1.4954 - classification_loss: 0.2483 330/500 [==================>...........] - ETA: 1:05 - loss: 1.7429 - regression_loss: 1.4948 - classification_loss: 0.2481 331/500 [==================>...........] - ETA: 1:04 - loss: 1.7458 - regression_loss: 1.4975 - classification_loss: 0.2483 332/500 [==================>...........] 
- ETA: 1:04 - loss: 1.7444 - regression_loss: 1.4963 - classification_loss: 0.2481 333/500 [==================>...........] - ETA: 1:04 - loss: 1.7447 - regression_loss: 1.4966 - classification_loss: 0.2481 334/500 [===================>..........] - ETA: 1:03 - loss: 1.7430 - regression_loss: 1.4952 - classification_loss: 0.2478 335/500 [===================>..........] - ETA: 1:03 - loss: 1.7428 - regression_loss: 1.4952 - classification_loss: 0.2476 336/500 [===================>..........] - ETA: 1:02 - loss: 1.7427 - regression_loss: 1.4952 - classification_loss: 0.2475 337/500 [===================>..........] - ETA: 1:02 - loss: 1.7412 - regression_loss: 1.4941 - classification_loss: 0.2471 338/500 [===================>..........] - ETA: 1:02 - loss: 1.7412 - regression_loss: 1.4942 - classification_loss: 0.2471 339/500 [===================>..........] - ETA: 1:01 - loss: 1.7426 - regression_loss: 1.4952 - classification_loss: 0.2473 340/500 [===================>..........] - ETA: 1:01 - loss: 1.7407 - regression_loss: 1.4937 - classification_loss: 0.2471 341/500 [===================>..........] - ETA: 1:00 - loss: 1.7405 - regression_loss: 1.4935 - classification_loss: 0.2470 342/500 [===================>..........] - ETA: 1:00 - loss: 1.7413 - regression_loss: 1.4943 - classification_loss: 0.2471 343/500 [===================>..........] - ETA: 1:00 - loss: 1.7416 - regression_loss: 1.4944 - classification_loss: 0.2471 344/500 [===================>..........] - ETA: 59s - loss: 1.7400 - regression_loss: 1.4933 - classification_loss: 0.2468  345/500 [===================>..........] - ETA: 59s - loss: 1.7398 - regression_loss: 1.4930 - classification_loss: 0.2468 346/500 [===================>..........] - ETA: 59s - loss: 1.7392 - regression_loss: 1.4920 - classification_loss: 0.2471 347/500 [===================>..........] - ETA: 58s - loss: 1.7392 - regression_loss: 1.4921 - classification_loss: 0.2471 348/500 [===================>..........] 
- ETA: 58s - loss: 1.7386 - regression_loss: 1.4915 - classification_loss: 0.2471 349/500 [===================>..........] - ETA: 57s - loss: 1.7382 - regression_loss: 1.4912 - classification_loss: 0.2470 350/500 [====================>.........] - ETA: 57s - loss: 1.7368 - regression_loss: 1.4900 - classification_loss: 0.2468 351/500 [====================>.........] - ETA: 57s - loss: 1.7352 - regression_loss: 1.4885 - classification_loss: 0.2467 352/500 [====================>.........] - ETA: 56s - loss: 1.7352 - regression_loss: 1.4885 - classification_loss: 0.2466 353/500 [====================>.........] - ETA: 56s - loss: 1.7341 - regression_loss: 1.4878 - classification_loss: 0.2463 354/500 [====================>.........] - ETA: 55s - loss: 1.7341 - regression_loss: 1.4879 - classification_loss: 0.2462 355/500 [====================>.........] - ETA: 55s - loss: 1.7349 - regression_loss: 1.4886 - classification_loss: 0.2464 356/500 [====================>.........] - ETA: 55s - loss: 1.7347 - regression_loss: 1.4883 - classification_loss: 0.2464 357/500 [====================>.........] - ETA: 54s - loss: 1.7358 - regression_loss: 1.4895 - classification_loss: 0.2463 358/500 [====================>.........] - ETA: 54s - loss: 1.7357 - regression_loss: 1.4893 - classification_loss: 0.2464 359/500 [====================>.........] - ETA: 54s - loss: 1.7352 - regression_loss: 1.4889 - classification_loss: 0.2463 360/500 [====================>.........] - ETA: 53s - loss: 1.7361 - regression_loss: 1.4896 - classification_loss: 0.2464 361/500 [====================>.........] - ETA: 53s - loss: 1.7364 - regression_loss: 1.4899 - classification_loss: 0.2465 362/500 [====================>.........] - ETA: 52s - loss: 1.7365 - regression_loss: 1.4898 - classification_loss: 0.2467 363/500 [====================>.........] - ETA: 52s - loss: 1.7370 - regression_loss: 1.4902 - classification_loss: 0.2467 364/500 [====================>.........] 
- ETA: 52s - loss: 1.7375 - regression_loss: 1.4907 - classification_loss: 0.2468 365/500 [====================>.........] - ETA: 51s - loss: 1.7371 - regression_loss: 1.4904 - classification_loss: 0.2467 366/500 [====================>.........] - ETA: 51s - loss: 1.7354 - regression_loss: 1.4888 - classification_loss: 0.2466 367/500 [=====================>........] - ETA: 50s - loss: 1.7357 - regression_loss: 1.4891 - classification_loss: 0.2466 368/500 [=====================>........] - ETA: 50s - loss: 1.7367 - regression_loss: 1.4898 - classification_loss: 0.2468 369/500 [=====================>........] - ETA: 50s - loss: 1.7364 - regression_loss: 1.4896 - classification_loss: 0.2468 370/500 [=====================>........] - ETA: 49s - loss: 1.7357 - regression_loss: 1.4891 - classification_loss: 0.2465 371/500 [=====================>........] - ETA: 49s - loss: 1.7341 - regression_loss: 1.4877 - classification_loss: 0.2464 372/500 [=====================>........] - ETA: 49s - loss: 1.7320 - regression_loss: 1.4859 - classification_loss: 0.2461 373/500 [=====================>........] - ETA: 48s - loss: 1.7322 - regression_loss: 1.4859 - classification_loss: 0.2463 374/500 [=====================>........] - ETA: 48s - loss: 1.7326 - regression_loss: 1.4862 - classification_loss: 0.2464 375/500 [=====================>........] - ETA: 47s - loss: 1.7332 - regression_loss: 1.4870 - classification_loss: 0.2463 376/500 [=====================>........] - ETA: 47s - loss: 1.7334 - regression_loss: 1.4871 - classification_loss: 0.2462 377/500 [=====================>........] - ETA: 47s - loss: 1.7331 - regression_loss: 1.4869 - classification_loss: 0.2462 378/500 [=====================>........] - ETA: 46s - loss: 1.7331 - regression_loss: 1.4870 - classification_loss: 0.2461 379/500 [=====================>........] - ETA: 46s - loss: 1.7328 - regression_loss: 1.4869 - classification_loss: 0.2459 380/500 [=====================>........] 
- ETA: 45s - loss: 1.7315 - regression_loss: 1.4856 - classification_loss: 0.2459 381/500 [=====================>........] - ETA: 45s - loss: 1.7311 - regression_loss: 1.4851 - classification_loss: 0.2460 382/500 [=====================>........] - ETA: 45s - loss: 1.7312 - regression_loss: 1.4852 - classification_loss: 0.2460 383/500 [=====================>........] - ETA: 44s - loss: 1.7308 - regression_loss: 1.4850 - classification_loss: 0.2458 384/500 [======================>.......] - ETA: 44s - loss: 1.7310 - regression_loss: 1.4853 - classification_loss: 0.2457 385/500 [======================>.......] - ETA: 44s - loss: 1.7303 - regression_loss: 1.4847 - classification_loss: 0.2456 386/500 [======================>.......] - ETA: 43s - loss: 1.7313 - regression_loss: 1.4857 - classification_loss: 0.2457 387/500 [======================>.......] - ETA: 43s - loss: 1.7303 - regression_loss: 1.4846 - classification_loss: 0.2456 388/500 [======================>.......] - ETA: 42s - loss: 1.7296 - regression_loss: 1.4840 - classification_loss: 0.2456 389/500 [======================>.......] - ETA: 42s - loss: 1.7294 - regression_loss: 1.4837 - classification_loss: 0.2456 390/500 [======================>.......] - ETA: 42s - loss: 1.7278 - regression_loss: 1.4824 - classification_loss: 0.2453 391/500 [======================>.......] - ETA: 41s - loss: 1.7279 - regression_loss: 1.4827 - classification_loss: 0.2452 392/500 [======================>.......] - ETA: 41s - loss: 1.7281 - regression_loss: 1.4828 - classification_loss: 0.2452 393/500 [======================>.......] - ETA: 40s - loss: 1.7288 - regression_loss: 1.4837 - classification_loss: 0.2451 394/500 [======================>.......] - ETA: 40s - loss: 1.7290 - regression_loss: 1.4840 - classification_loss: 0.2451 395/500 [======================>.......] - ETA: 40s - loss: 1.7297 - regression_loss: 1.4845 - classification_loss: 0.2451 396/500 [======================>.......] 
- ETA: 39s - loss: 1.7293 - regression_loss: 1.4843 - classification_loss: 0.2450 397/500 [======================>.......] - ETA: 39s - loss: 1.7299 - regression_loss: 1.4847 - classification_loss: 0.2451 398/500 [======================>.......] - ETA: 39s - loss: 1.7302 - regression_loss: 1.4852 - classification_loss: 0.2451 399/500 [======================>.......] - ETA: 38s - loss: 1.7286 - regression_loss: 1.4836 - classification_loss: 0.2451 400/500 [=======================>......] - ETA: 38s - loss: 1.7269 - regression_loss: 1.4821 - classification_loss: 0.2448 401/500 [=======================>......] - ETA: 37s - loss: 1.7252 - regression_loss: 1.4803 - classification_loss: 0.2449 402/500 [=======================>......] - ETA: 37s - loss: 1.7232 - regression_loss: 1.4786 - classification_loss: 0.2447 403/500 [=======================>......] - ETA: 37s - loss: 1.7221 - regression_loss: 1.4775 - classification_loss: 0.2446 404/500 [=======================>......] - ETA: 36s - loss: 1.7197 - regression_loss: 1.4756 - classification_loss: 0.2441 405/500 [=======================>......] - ETA: 36s - loss: 1.7201 - regression_loss: 1.4760 - classification_loss: 0.2441 406/500 [=======================>......] - ETA: 35s - loss: 1.7189 - regression_loss: 1.4750 - classification_loss: 0.2439 407/500 [=======================>......] - ETA: 35s - loss: 1.7186 - regression_loss: 1.4746 - classification_loss: 0.2439 408/500 [=======================>......] - ETA: 35s - loss: 1.7174 - regression_loss: 1.4736 - classification_loss: 0.2437 409/500 [=======================>......] - ETA: 34s - loss: 1.7158 - regression_loss: 1.4723 - classification_loss: 0.2435 410/500 [=======================>......] - ETA: 34s - loss: 1.7167 - regression_loss: 1.4730 - classification_loss: 0.2437 411/500 [=======================>......] - ETA: 34s - loss: 1.7167 - regression_loss: 1.4731 - classification_loss: 0.2436 412/500 [=======================>......] 
- ETA: 33s - loss: 1.7164 - regression_loss: 1.4729 - classification_loss: 0.2435 413/500 [=======================>......] - ETA: 33s - loss: 1.7157 - regression_loss: 1.4723 - classification_loss: 0.2434 414/500 [=======================>......] - ETA: 32s - loss: 1.7157 - regression_loss: 1.4723 - classification_loss: 0.2434 415/500 [=======================>......] - ETA: 32s - loss: 1.7153 - regression_loss: 1.4720 - classification_loss: 0.2433 416/500 [=======================>......] - ETA: 32s - loss: 1.7156 - regression_loss: 1.4722 - classification_loss: 0.2433 417/500 [========================>.....] - ETA: 31s - loss: 1.7150 - regression_loss: 1.4718 - classification_loss: 0.2432 418/500 [========================>.....] - ETA: 31s - loss: 1.7151 - regression_loss: 1.4720 - classification_loss: 0.2432 419/500 [========================>.....] - ETA: 31s - loss: 1.7144 - regression_loss: 1.4713 - classification_loss: 0.2431 420/500 [========================>.....] - ETA: 30s - loss: 1.7131 - regression_loss: 1.4702 - classification_loss: 0.2429 421/500 [========================>.....] - ETA: 30s - loss: 1.7111 - regression_loss: 1.4685 - classification_loss: 0.2426 422/500 [========================>.....] - ETA: 29s - loss: 1.7111 - regression_loss: 1.4685 - classification_loss: 0.2425 423/500 [========================>.....] - ETA: 29s - loss: 1.7114 - regression_loss: 1.4689 - classification_loss: 0.2424 424/500 [========================>.....] - ETA: 29s - loss: 1.7122 - regression_loss: 1.4697 - classification_loss: 0.2426 425/500 [========================>.....] - ETA: 28s - loss: 1.7125 - regression_loss: 1.4699 - classification_loss: 0.2426 426/500 [========================>.....] - ETA: 28s - loss: 1.7112 - regression_loss: 1.4688 - classification_loss: 0.2424 427/500 [========================>.....] - ETA: 27s - loss: 1.7113 - regression_loss: 1.4690 - classification_loss: 0.2424 428/500 [========================>.....] 
- ETA: 27s - loss: 1.7119 - regression_loss: 1.4695 - classification_loss: 0.2424 429/500 [========================>.....] - ETA: 27s - loss: 1.7116 - regression_loss: 1.4693 - classification_loss: 0.2423 430/500 [========================>.....] - ETA: 26s - loss: 1.7101 - regression_loss: 1.4680 - classification_loss: 0.2421 431/500 [========================>.....] - ETA: 26s - loss: 1.7102 - regression_loss: 1.4681 - classification_loss: 0.2421 432/500 [========================>.....] - ETA: 26s - loss: 1.7099 - regression_loss: 1.4678 - classification_loss: 0.2421 433/500 [========================>.....] - ETA: 25s - loss: 1.7097 - regression_loss: 1.4677 - classification_loss: 0.2420 434/500 [=========================>....] - ETA: 25s - loss: 1.7099 - regression_loss: 1.4680 - classification_loss: 0.2419 435/500 [=========================>....] - ETA: 24s - loss: 1.7102 - regression_loss: 1.4683 - classification_loss: 0.2419 436/500 [=========================>....] - ETA: 24s - loss: 1.7104 - regression_loss: 1.4685 - classification_loss: 0.2419 437/500 [=========================>....] - ETA: 24s - loss: 1.7103 - regression_loss: 1.4683 - classification_loss: 0.2419 438/500 [=========================>....] - ETA: 23s - loss: 1.7116 - regression_loss: 1.4695 - classification_loss: 0.2421 439/500 [=========================>....] - ETA: 23s - loss: 1.7128 - regression_loss: 1.4707 - classification_loss: 0.2422 440/500 [=========================>....] - ETA: 22s - loss: 1.7135 - regression_loss: 1.4714 - classification_loss: 0.2421 441/500 [=========================>....] - ETA: 22s - loss: 1.7123 - regression_loss: 1.4703 - classification_loss: 0.2419 442/500 [=========================>....] - ETA: 22s - loss: 1.7129 - regression_loss: 1.4709 - classification_loss: 0.2419 443/500 [=========================>....] - ETA: 21s - loss: 1.7125 - regression_loss: 1.4705 - classification_loss: 0.2419 444/500 [=========================>....] 
- ETA: 21s - loss: 1.7126 - regression_loss: 1.4707 - classification_loss: 0.2419 445/500 [=========================>....] - ETA: 21s - loss: 1.7124 - regression_loss: 1.4707 - classification_loss: 0.2417 446/500 [=========================>....] - ETA: 20s - loss: 1.7109 - regression_loss: 1.4694 - classification_loss: 0.2415 447/500 [=========================>....] - ETA: 20s - loss: 1.7110 - regression_loss: 1.4695 - classification_loss: 0.2415 448/500 [=========================>....] - ETA: 19s - loss: 1.7110 - regression_loss: 1.4696 - classification_loss: 0.2414 449/500 [=========================>....] - ETA: 19s - loss: 1.7116 - regression_loss: 1.4701 - classification_loss: 0.2415 450/500 [==========================>...] - ETA: 19s - loss: 1.7124 - regression_loss: 1.4709 - classification_loss: 0.2415 451/500 [==========================>...] - ETA: 18s - loss: 1.7110 - regression_loss: 1.4695 - classification_loss: 0.2415 452/500 [==========================>...] - ETA: 18s - loss: 1.7099 - regression_loss: 1.4685 - classification_loss: 0.2414 453/500 [==========================>...] - ETA: 17s - loss: 1.7074 - regression_loss: 1.4663 - classification_loss: 0.2411 454/500 [==========================>...] - ETA: 17s - loss: 1.7069 - regression_loss: 1.4659 - classification_loss: 0.2411 455/500 [==========================>...] - ETA: 17s - loss: 1.7064 - regression_loss: 1.4654 - classification_loss: 0.2410 456/500 [==========================>...] - ETA: 16s - loss: 1.7063 - regression_loss: 1.4653 - classification_loss: 0.2410 457/500 [==========================>...] - ETA: 16s - loss: 1.7057 - regression_loss: 1.4649 - classification_loss: 0.2409 458/500 [==========================>...] - ETA: 16s - loss: 1.7063 - regression_loss: 1.4655 - classification_loss: 0.2408 459/500 [==========================>...] - ETA: 15s - loss: 1.7052 - regression_loss: 1.4646 - classification_loss: 0.2406 460/500 [==========================>...] 
- ETA: 15s - loss: 1.7054 - regression_loss: 1.4647 - classification_loss: 0.2407 461/500 [==========================>...] - ETA: 14s - loss: 1.7049 - regression_loss: 1.4642 - classification_loss: 0.2407 462/500 [==========================>...] - ETA: 14s - loss: 1.7047 - regression_loss: 1.4641 - classification_loss: 0.2407 463/500 [==========================>...] - ETA: 14s - loss: 1.7056 - regression_loss: 1.4648 - classification_loss: 0.2408 464/500 [==========================>...] - ETA: 13s - loss: 1.7043 - regression_loss: 1.4638 - classification_loss: 0.2405 465/500 [==========================>...] - ETA: 13s - loss: 1.7045 - regression_loss: 1.4638 - classification_loss: 0.2407 466/500 [==========================>...] - ETA: 13s - loss: 1.7045 - regression_loss: 1.4637 - classification_loss: 0.2407 467/500 [===========================>..] - ETA: 12s - loss: 1.7051 - regression_loss: 1.4642 - classification_loss: 0.2409 468/500 [===========================>..] - ETA: 12s - loss: 1.7043 - regression_loss: 1.4635 - classification_loss: 0.2408 469/500 [===========================>..] - ETA: 11s - loss: 1.7055 - regression_loss: 1.4645 - classification_loss: 0.2410 470/500 [===========================>..] - ETA: 11s - loss: 1.7040 - regression_loss: 1.4631 - classification_loss: 0.2409 471/500 [===========================>..] - ETA: 11s - loss: 1.7026 - regression_loss: 1.4619 - classification_loss: 0.2408 472/500 [===========================>..] - ETA: 10s - loss: 1.7015 - regression_loss: 1.4610 - classification_loss: 0.2405 473/500 [===========================>..] - ETA: 10s - loss: 1.7028 - regression_loss: 1.4620 - classification_loss: 0.2408 474/500 [===========================>..] - ETA: 9s - loss: 1.7029 - regression_loss: 1.4621 - classification_loss: 0.2408  475/500 [===========================>..] - ETA: 9s - loss: 1.7025 - regression_loss: 1.4617 - classification_loss: 0.2408 476/500 [===========================>..] 
500/500 [==============================] - 191s 383ms/step - loss: 1.6930 - regression_loss: 1.4530 - classification_loss: 0.2401
1172 instances of class plum with average precision: 0.5924
mAP: 0.5924
Epoch 00004: saving model to ./training/snapshots/resnet101_pascal_04.h5
Epoch 5/20
311/500 [=================>............]
- ETA: 1:12 - loss: 1.5600 - regression_loss: 1.3443 - classification_loss: 0.2157 312/500 [=================>............] - ETA: 1:12 - loss: 1.5593 - regression_loss: 1.3437 - classification_loss: 0.2156 313/500 [=================>............] - ETA: 1:11 - loss: 1.5602 - regression_loss: 1.3443 - classification_loss: 0.2158 314/500 [=================>............] - ETA: 1:11 - loss: 1.5606 - regression_loss: 1.3448 - classification_loss: 0.2158 315/500 [=================>............] - ETA: 1:10 - loss: 1.5604 - regression_loss: 1.3447 - classification_loss: 0.2157 316/500 [=================>............] - ETA: 1:10 - loss: 1.5606 - regression_loss: 1.3449 - classification_loss: 0.2157 317/500 [==================>...........] - ETA: 1:10 - loss: 1.5615 - regression_loss: 1.3457 - classification_loss: 0.2159 318/500 [==================>...........] - ETA: 1:09 - loss: 1.5615 - regression_loss: 1.3454 - classification_loss: 0.2160 319/500 [==================>...........] - ETA: 1:09 - loss: 1.5613 - regression_loss: 1.3452 - classification_loss: 0.2161 320/500 [==================>...........] - ETA: 1:09 - loss: 1.5620 - regression_loss: 1.3458 - classification_loss: 0.2162 321/500 [==================>...........] - ETA: 1:08 - loss: 1.5629 - regression_loss: 1.3466 - classification_loss: 0.2163 322/500 [==================>...........] - ETA: 1:08 - loss: 1.5624 - regression_loss: 1.3462 - classification_loss: 0.2161 323/500 [==================>...........] - ETA: 1:07 - loss: 1.5616 - regression_loss: 1.3454 - classification_loss: 0.2161 324/500 [==================>...........] - ETA: 1:07 - loss: 1.5623 - regression_loss: 1.3461 - classification_loss: 0.2162 325/500 [==================>...........] - ETA: 1:07 - loss: 1.5623 - regression_loss: 1.3461 - classification_loss: 0.2161 326/500 [==================>...........] - ETA: 1:06 - loss: 1.5629 - regression_loss: 1.3467 - classification_loss: 0.2162 327/500 [==================>...........] 
- ETA: 1:06 - loss: 1.5631 - regression_loss: 1.3470 - classification_loss: 0.2161 328/500 [==================>...........] - ETA: 1:05 - loss: 1.5605 - regression_loss: 1.3448 - classification_loss: 0.2158 329/500 [==================>...........] - ETA: 1:05 - loss: 1.5584 - regression_loss: 1.3430 - classification_loss: 0.2155 330/500 [==================>...........] - ETA: 1:05 - loss: 1.5586 - regression_loss: 1.3431 - classification_loss: 0.2155 331/500 [==================>...........] - ETA: 1:04 - loss: 1.5597 - regression_loss: 1.3441 - classification_loss: 0.2155 332/500 [==================>...........] - ETA: 1:04 - loss: 1.5578 - regression_loss: 1.3425 - classification_loss: 0.2152 333/500 [==================>...........] - ETA: 1:04 - loss: 1.5589 - regression_loss: 1.3434 - classification_loss: 0.2155 334/500 [===================>..........] - ETA: 1:03 - loss: 1.5589 - regression_loss: 1.3436 - classification_loss: 0.2153 335/500 [===================>..........] - ETA: 1:03 - loss: 1.5579 - regression_loss: 1.3425 - classification_loss: 0.2154 336/500 [===================>..........] - ETA: 1:02 - loss: 1.5580 - regression_loss: 1.3426 - classification_loss: 0.2154 337/500 [===================>..........] - ETA: 1:02 - loss: 1.5580 - regression_loss: 1.3426 - classification_loss: 0.2154 338/500 [===================>..........] - ETA: 1:02 - loss: 1.5580 - regression_loss: 1.3427 - classification_loss: 0.2154 339/500 [===================>..........] - ETA: 1:01 - loss: 1.5584 - regression_loss: 1.3430 - classification_loss: 0.2153 340/500 [===================>..........] - ETA: 1:01 - loss: 1.5595 - regression_loss: 1.3440 - classification_loss: 0.2155 341/500 [===================>..........] - ETA: 1:00 - loss: 1.5592 - regression_loss: 1.3438 - classification_loss: 0.2155 342/500 [===================>..........] - ETA: 1:00 - loss: 1.5582 - regression_loss: 1.3429 - classification_loss: 0.2153 343/500 [===================>..........] 
- ETA: 1:00 - loss: 1.5585 - regression_loss: 1.3433 - classification_loss: 0.2152 344/500 [===================>..........] - ETA: 59s - loss: 1.5584 - regression_loss: 1.3432 - classification_loss: 0.2152  345/500 [===================>..........] - ETA: 59s - loss: 1.5583 - regression_loss: 1.3430 - classification_loss: 0.2152 346/500 [===================>..........] - ETA: 59s - loss: 1.5588 - regression_loss: 1.3435 - classification_loss: 0.2153 347/500 [===================>..........] - ETA: 58s - loss: 1.5589 - regression_loss: 1.3434 - classification_loss: 0.2154 348/500 [===================>..........] - ETA: 58s - loss: 1.5578 - regression_loss: 1.3425 - classification_loss: 0.2153 349/500 [===================>..........] - ETA: 57s - loss: 1.5578 - regression_loss: 1.3427 - classification_loss: 0.2152 350/500 [====================>.........] - ETA: 57s - loss: 1.5582 - regression_loss: 1.3430 - classification_loss: 0.2152 351/500 [====================>.........] - ETA: 57s - loss: 1.5566 - regression_loss: 1.3416 - classification_loss: 0.2150 352/500 [====================>.........] - ETA: 56s - loss: 1.5564 - regression_loss: 1.3415 - classification_loss: 0.2149 353/500 [====================>.........] - ETA: 56s - loss: 1.5542 - regression_loss: 1.3394 - classification_loss: 0.2147 354/500 [====================>.........] - ETA: 55s - loss: 1.5545 - regression_loss: 1.3398 - classification_loss: 0.2147 355/500 [====================>.........] - ETA: 55s - loss: 1.5556 - regression_loss: 1.3407 - classification_loss: 0.2149 356/500 [====================>.........] - ETA: 55s - loss: 1.5558 - regression_loss: 1.3409 - classification_loss: 0.2149 357/500 [====================>.........] - ETA: 54s - loss: 1.5557 - regression_loss: 1.3408 - classification_loss: 0.2149 358/500 [====================>.........] - ETA: 54s - loss: 1.5560 - regression_loss: 1.3410 - classification_loss: 0.2149 359/500 [====================>.........] 
- ETA: 54s - loss: 1.5544 - regression_loss: 1.3397 - classification_loss: 0.2147 360/500 [====================>.........] - ETA: 53s - loss: 1.5545 - regression_loss: 1.3398 - classification_loss: 0.2147 361/500 [====================>.........] - ETA: 53s - loss: 1.5549 - regression_loss: 1.3402 - classification_loss: 0.2147 362/500 [====================>.........] - ETA: 52s - loss: 1.5538 - regression_loss: 1.3393 - classification_loss: 0.2145 363/500 [====================>.........] - ETA: 52s - loss: 1.5529 - regression_loss: 1.3385 - classification_loss: 0.2144 364/500 [====================>.........] - ETA: 52s - loss: 1.5540 - regression_loss: 1.3394 - classification_loss: 0.2146 365/500 [====================>.........] - ETA: 51s - loss: 1.5530 - regression_loss: 1.3384 - classification_loss: 0.2146 366/500 [====================>.........] - ETA: 51s - loss: 1.5529 - regression_loss: 1.3384 - classification_loss: 0.2145 367/500 [=====================>........] - ETA: 50s - loss: 1.5510 - regression_loss: 1.3369 - classification_loss: 0.2142 368/500 [=====================>........] - ETA: 50s - loss: 1.5526 - regression_loss: 1.3382 - classification_loss: 0.2144 369/500 [=====================>........] - ETA: 50s - loss: 1.5531 - regression_loss: 1.3386 - classification_loss: 0.2145 370/500 [=====================>........] - ETA: 49s - loss: 1.5536 - regression_loss: 1.3391 - classification_loss: 0.2144 371/500 [=====================>........] - ETA: 49s - loss: 1.5533 - regression_loss: 1.3390 - classification_loss: 0.2143 372/500 [=====================>........] - ETA: 49s - loss: 1.5549 - regression_loss: 1.3404 - classification_loss: 0.2145 373/500 [=====================>........] - ETA: 48s - loss: 1.5548 - regression_loss: 1.3403 - classification_loss: 0.2145 374/500 [=====================>........] - ETA: 48s - loss: 1.5527 - regression_loss: 1.3386 - classification_loss: 0.2142 375/500 [=====================>........] 
- ETA: 47s - loss: 1.5528 - regression_loss: 1.3386 - classification_loss: 0.2142 376/500 [=====================>........] - ETA: 47s - loss: 1.5530 - regression_loss: 1.3389 - classification_loss: 0.2141 377/500 [=====================>........] - ETA: 47s - loss: 1.5543 - regression_loss: 1.3400 - classification_loss: 0.2143 378/500 [=====================>........] - ETA: 46s - loss: 1.5562 - regression_loss: 1.3416 - classification_loss: 0.2146 379/500 [=====================>........] - ETA: 46s - loss: 1.5566 - regression_loss: 1.3419 - classification_loss: 0.2147 380/500 [=====================>........] - ETA: 45s - loss: 1.5568 - regression_loss: 1.3422 - classification_loss: 0.2146 381/500 [=====================>........] - ETA: 45s - loss: 1.5551 - regression_loss: 1.3407 - classification_loss: 0.2143 382/500 [=====================>........] - ETA: 45s - loss: 1.5556 - regression_loss: 1.3412 - classification_loss: 0.2144 383/500 [=====================>........] - ETA: 44s - loss: 1.5569 - regression_loss: 1.3422 - classification_loss: 0.2147 384/500 [======================>.......] - ETA: 44s - loss: 1.5562 - regression_loss: 1.3416 - classification_loss: 0.2146 385/500 [======================>.......] - ETA: 44s - loss: 1.5548 - regression_loss: 1.3405 - classification_loss: 0.2143 386/500 [======================>.......] - ETA: 43s - loss: 1.5536 - regression_loss: 1.3395 - classification_loss: 0.2142 387/500 [======================>.......] - ETA: 43s - loss: 1.5542 - regression_loss: 1.3402 - classification_loss: 0.2140 388/500 [======================>.......] - ETA: 42s - loss: 1.5554 - regression_loss: 1.3413 - classification_loss: 0.2141 389/500 [======================>.......] - ETA: 42s - loss: 1.5552 - regression_loss: 1.3412 - classification_loss: 0.2140 390/500 [======================>.......] - ETA: 42s - loss: 1.5554 - regression_loss: 1.3413 - classification_loss: 0.2141 391/500 [======================>.......] 
- ETA: 41s - loss: 1.5554 - regression_loss: 1.3413 - classification_loss: 0.2141 392/500 [======================>.......] - ETA: 41s - loss: 1.5563 - regression_loss: 1.3420 - classification_loss: 0.2143 393/500 [======================>.......] - ETA: 40s - loss: 1.5568 - regression_loss: 1.3425 - classification_loss: 0.2143 394/500 [======================>.......] - ETA: 40s - loss: 1.5577 - regression_loss: 1.3435 - classification_loss: 0.2143 395/500 [======================>.......] - ETA: 40s - loss: 1.5575 - regression_loss: 1.3433 - classification_loss: 0.2142 396/500 [======================>.......] - ETA: 39s - loss: 1.5567 - regression_loss: 1.3428 - classification_loss: 0.2139 397/500 [======================>.......] - ETA: 39s - loss: 1.5571 - regression_loss: 1.3431 - classification_loss: 0.2139 398/500 [======================>.......] - ETA: 39s - loss: 1.5562 - regression_loss: 1.3425 - classification_loss: 0.2137 399/500 [======================>.......] - ETA: 38s - loss: 1.5568 - regression_loss: 1.3429 - classification_loss: 0.2139 400/500 [=======================>......] - ETA: 38s - loss: 1.5549 - regression_loss: 1.3413 - classification_loss: 0.2136 401/500 [=======================>......] - ETA: 37s - loss: 1.5550 - regression_loss: 1.3414 - classification_loss: 0.2136 402/500 [=======================>......] - ETA: 37s - loss: 1.5546 - regression_loss: 1.3411 - classification_loss: 0.2135 403/500 [=======================>......] - ETA: 37s - loss: 1.5542 - regression_loss: 1.3408 - classification_loss: 0.2134 404/500 [=======================>......] - ETA: 36s - loss: 1.5553 - regression_loss: 1.3418 - classification_loss: 0.2135 405/500 [=======================>......] - ETA: 36s - loss: 1.5572 - regression_loss: 1.3437 - classification_loss: 0.2136 406/500 [=======================>......] - ETA: 36s - loss: 1.5569 - regression_loss: 1.3434 - classification_loss: 0.2135 407/500 [=======================>......] 
- ETA: 35s - loss: 1.5567 - regression_loss: 1.3432 - classification_loss: 0.2134 408/500 [=======================>......] - ETA: 35s - loss: 1.5557 - regression_loss: 1.3423 - classification_loss: 0.2134 409/500 [=======================>......] - ETA: 34s - loss: 1.5558 - regression_loss: 1.3425 - classification_loss: 0.2133 410/500 [=======================>......] - ETA: 34s - loss: 1.5565 - regression_loss: 1.3430 - classification_loss: 0.2134 411/500 [=======================>......] - ETA: 34s - loss: 1.5570 - regression_loss: 1.3435 - classification_loss: 0.2135 412/500 [=======================>......] - ETA: 33s - loss: 1.5575 - regression_loss: 1.3437 - classification_loss: 0.2138 413/500 [=======================>......] - ETA: 33s - loss: 1.5560 - regression_loss: 1.3425 - classification_loss: 0.2135 414/500 [=======================>......] - ETA: 32s - loss: 1.5569 - regression_loss: 1.3433 - classification_loss: 0.2136 415/500 [=======================>......] - ETA: 32s - loss: 1.5554 - regression_loss: 1.3420 - classification_loss: 0.2133 416/500 [=======================>......] - ETA: 32s - loss: 1.5544 - regression_loss: 1.3411 - classification_loss: 0.2133 417/500 [========================>.....] - ETA: 31s - loss: 1.5523 - regression_loss: 1.3392 - classification_loss: 0.2130 418/500 [========================>.....] - ETA: 31s - loss: 1.5510 - regression_loss: 1.3380 - classification_loss: 0.2130 419/500 [========================>.....] - ETA: 31s - loss: 1.5512 - regression_loss: 1.3381 - classification_loss: 0.2131 420/500 [========================>.....] - ETA: 30s - loss: 1.5517 - regression_loss: 1.3385 - classification_loss: 0.2131 421/500 [========================>.....] - ETA: 30s - loss: 1.5519 - regression_loss: 1.3387 - classification_loss: 0.2133 422/500 [========================>.....] - ETA: 29s - loss: 1.5520 - regression_loss: 1.3386 - classification_loss: 0.2135 423/500 [========================>.....] 
- ETA: 29s - loss: 1.5527 - regression_loss: 1.3391 - classification_loss: 0.2135 424/500 [========================>.....] - ETA: 29s - loss: 1.5526 - regression_loss: 1.3392 - classification_loss: 0.2135 425/500 [========================>.....] - ETA: 28s - loss: 1.5525 - regression_loss: 1.3390 - classification_loss: 0.2136 426/500 [========================>.....] - ETA: 28s - loss: 1.5526 - regression_loss: 1.3391 - classification_loss: 0.2135 427/500 [========================>.....] - ETA: 27s - loss: 1.5527 - regression_loss: 1.3392 - classification_loss: 0.2135 428/500 [========================>.....] - ETA: 27s - loss: 1.5535 - regression_loss: 1.3398 - classification_loss: 0.2136 429/500 [========================>.....] - ETA: 27s - loss: 1.5535 - regression_loss: 1.3398 - classification_loss: 0.2137 430/500 [========================>.....] - ETA: 26s - loss: 1.5535 - regression_loss: 1.3398 - classification_loss: 0.2137 431/500 [========================>.....] - ETA: 26s - loss: 1.5520 - regression_loss: 1.3384 - classification_loss: 0.2135 432/500 [========================>.....] - ETA: 26s - loss: 1.5515 - regression_loss: 1.3380 - classification_loss: 0.2135 433/500 [========================>.....] - ETA: 25s - loss: 1.5516 - regression_loss: 1.3383 - classification_loss: 0.2133 434/500 [=========================>....] - ETA: 25s - loss: 1.5520 - regression_loss: 1.3386 - classification_loss: 0.2134 435/500 [=========================>....] - ETA: 24s - loss: 1.5517 - regression_loss: 1.3383 - classification_loss: 0.2134 436/500 [=========================>....] - ETA: 24s - loss: 1.5511 - regression_loss: 1.3378 - classification_loss: 0.2132 437/500 [=========================>....] - ETA: 24s - loss: 1.5506 - regression_loss: 1.3375 - classification_loss: 0.2132 438/500 [=========================>....] - ETA: 23s - loss: 1.5493 - regression_loss: 1.3363 - classification_loss: 0.2130 439/500 [=========================>....] 
- ETA: 23s - loss: 1.5483 - regression_loss: 1.3354 - classification_loss: 0.2129 440/500 [=========================>....] - ETA: 22s - loss: 1.5479 - regression_loss: 1.3350 - classification_loss: 0.2128 441/500 [=========================>....] - ETA: 22s - loss: 1.5466 - regression_loss: 1.3337 - classification_loss: 0.2130 442/500 [=========================>....] - ETA: 22s - loss: 1.5449 - regression_loss: 1.3323 - classification_loss: 0.2126 443/500 [=========================>....] - ETA: 21s - loss: 1.5437 - regression_loss: 1.3312 - classification_loss: 0.2124 444/500 [=========================>....] - ETA: 21s - loss: 1.5425 - regression_loss: 1.3303 - classification_loss: 0.2122 445/500 [=========================>....] - ETA: 21s - loss: 1.5417 - regression_loss: 1.3297 - classification_loss: 0.2120 446/500 [=========================>....] - ETA: 20s - loss: 1.5409 - regression_loss: 1.3289 - classification_loss: 0.2119 447/500 [=========================>....] - ETA: 20s - loss: 1.5415 - regression_loss: 1.3296 - classification_loss: 0.2119 448/500 [=========================>....] - ETA: 19s - loss: 1.5407 - regression_loss: 1.3288 - classification_loss: 0.2119 449/500 [=========================>....] - ETA: 19s - loss: 1.5391 - regression_loss: 1.3275 - classification_loss: 0.2116 450/500 [==========================>...] - ETA: 19s - loss: 1.5367 - regression_loss: 1.3254 - classification_loss: 0.2114 451/500 [==========================>...] - ETA: 18s - loss: 1.5366 - regression_loss: 1.3253 - classification_loss: 0.2113 452/500 [==========================>...] - ETA: 18s - loss: 1.5367 - regression_loss: 1.3253 - classification_loss: 0.2114 453/500 [==========================>...] - ETA: 17s - loss: 1.5374 - regression_loss: 1.3259 - classification_loss: 0.2114 454/500 [==========================>...] - ETA: 17s - loss: 1.5362 - regression_loss: 1.3248 - classification_loss: 0.2113 455/500 [==========================>...] 
- ETA: 17s - loss: 1.5362 - regression_loss: 1.3250 - classification_loss: 0.2113 456/500 [==========================>...] - ETA: 16s - loss: 1.5368 - regression_loss: 1.3255 - classification_loss: 0.2113 457/500 [==========================>...] - ETA: 16s - loss: 1.5375 - regression_loss: 1.3263 - classification_loss: 0.2112 458/500 [==========================>...] - ETA: 16s - loss: 1.5371 - regression_loss: 1.3260 - classification_loss: 0.2112 459/500 [==========================>...] - ETA: 15s - loss: 1.5374 - regression_loss: 1.3262 - classification_loss: 0.2112 460/500 [==========================>...] - ETA: 15s - loss: 1.5371 - regression_loss: 1.3260 - classification_loss: 0.2111 461/500 [==========================>...] - ETA: 14s - loss: 1.5375 - regression_loss: 1.3264 - classification_loss: 0.2111 462/500 [==========================>...] - ETA: 14s - loss: 1.5377 - regression_loss: 1.3266 - classification_loss: 0.2111 463/500 [==========================>...] - ETA: 14s - loss: 1.5360 - regression_loss: 1.3251 - classification_loss: 0.2109 464/500 [==========================>...] - ETA: 13s - loss: 1.5355 - regression_loss: 1.3247 - classification_loss: 0.2108 465/500 [==========================>...] - ETA: 13s - loss: 1.5351 - regression_loss: 1.3244 - classification_loss: 0.2108 466/500 [==========================>...] - ETA: 13s - loss: 1.5347 - regression_loss: 1.3240 - classification_loss: 0.2107 467/500 [===========================>..] - ETA: 12s - loss: 1.5336 - regression_loss: 1.3231 - classification_loss: 0.2104 468/500 [===========================>..] - ETA: 12s - loss: 1.5337 - regression_loss: 1.3232 - classification_loss: 0.2105 469/500 [===========================>..] - ETA: 11s - loss: 1.5339 - regression_loss: 1.3234 - classification_loss: 0.2105 470/500 [===========================>..] - ETA: 11s - loss: 1.5328 - regression_loss: 1.3225 - classification_loss: 0.2103 471/500 [===========================>..] 
- ETA: 11s - loss: 1.5323 - regression_loss: 1.3218 - classification_loss: 0.2104 472/500 [===========================>..] - ETA: 10s - loss: 1.5318 - regression_loss: 1.3215 - classification_loss: 0.2103 473/500 [===========================>..] - ETA: 10s - loss: 1.5316 - regression_loss: 1.3212 - classification_loss: 0.2105 474/500 [===========================>..] - ETA: 9s - loss: 1.5315 - regression_loss: 1.3211 - classification_loss: 0.2104  475/500 [===========================>..] - ETA: 9s - loss: 1.5321 - regression_loss: 1.3217 - classification_loss: 0.2103 476/500 [===========================>..] - ETA: 9s - loss: 1.5306 - regression_loss: 1.3206 - classification_loss: 0.2101 477/500 [===========================>..] - ETA: 8s - loss: 1.5313 - regression_loss: 1.3212 - classification_loss: 0.2102 478/500 [===========================>..] - ETA: 8s - loss: 1.5322 - regression_loss: 1.3219 - classification_loss: 0.2103 479/500 [===========================>..] - ETA: 8s - loss: 1.5334 - regression_loss: 1.3228 - classification_loss: 0.2105 480/500 [===========================>..] - ETA: 7s - loss: 1.5338 - regression_loss: 1.3231 - classification_loss: 0.2107 481/500 [===========================>..] - ETA: 7s - loss: 1.5347 - regression_loss: 1.3239 - classification_loss: 0.2108 482/500 [===========================>..] - ETA: 6s - loss: 1.5345 - regression_loss: 1.3237 - classification_loss: 0.2108 483/500 [===========================>..] - ETA: 6s - loss: 1.5333 - regression_loss: 1.3227 - classification_loss: 0.2106 484/500 [============================>.] - ETA: 6s - loss: 1.5325 - regression_loss: 1.3219 - classification_loss: 0.2106 485/500 [============================>.] - ETA: 5s - loss: 1.5329 - regression_loss: 1.3222 - classification_loss: 0.2107 486/500 [============================>.] - ETA: 5s - loss: 1.5323 - regression_loss: 1.3217 - classification_loss: 0.2106 487/500 [============================>.] 
- ETA: 4s - loss: 1.5331 - regression_loss: 1.3223 - classification_loss: 0.2108 488/500 [============================>.] - ETA: 4s - loss: 1.5334 - regression_loss: 1.3226 - classification_loss: 0.2109 489/500 [============================>.] - ETA: 4s - loss: 1.5321 - regression_loss: 1.3213 - classification_loss: 0.2108 490/500 [============================>.] - ETA: 3s - loss: 1.5326 - regression_loss: 1.3217 - classification_loss: 0.2109 491/500 [============================>.] - ETA: 3s - loss: 1.5336 - regression_loss: 1.3226 - classification_loss: 0.2110 492/500 [============================>.] - ETA: 3s - loss: 1.5322 - regression_loss: 1.3214 - classification_loss: 0.2108 493/500 [============================>.] - ETA: 2s - loss: 1.5320 - regression_loss: 1.3213 - classification_loss: 0.2108 494/500 [============================>.] - ETA: 2s - loss: 1.5317 - regression_loss: 1.3211 - classification_loss: 0.2107 495/500 [============================>.] - ETA: 1s - loss: 1.5322 - regression_loss: 1.3214 - classification_loss: 0.2108 496/500 [============================>.] - ETA: 1s - loss: 1.5313 - regression_loss: 1.3207 - classification_loss: 0.2106 497/500 [============================>.] - ETA: 1s - loss: 1.5311 - regression_loss: 1.3206 - classification_loss: 0.2105 498/500 [============================>.] - ETA: 0s - loss: 1.5312 - regression_loss: 1.3206 - classification_loss: 0.2105 499/500 [============================>.] - ETA: 0s - loss: 1.5298 - regression_loss: 1.3195 - classification_loss: 0.2103 500/500 [==============================] - 191s 383ms/step - loss: 1.5292 - regression_loss: 1.3190 - classification_loss: 0.2102 1172 instances of class plum with average precision: 0.6089 mAP: 0.6089 Epoch 00005: saving model to ./training/snapshots/resnet101_pascal_05.h5 Epoch 6/20 1/500 [..............................] - ETA: 3:05 - loss: 1.7174 - regression_loss: 1.4878 - classification_loss: 0.2295 2/500 [..............................] 
- ETA: 3:10 - loss: 1.6241 - regression_loss: 1.3357 - classification_loss: 0.2884
[... per-batch progress-bar updates for batches 2-81 of epoch 6 trimmed; loss settled around ~1.43 ...]
82/500 [===>..........................]
- ETA: 2:41 - loss: 1.4505 - regression_loss: 1.2450 - classification_loss: 0.2055 83/500 [===>..........................] - ETA: 2:40 - loss: 1.4530 - regression_loss: 1.2468 - classification_loss: 0.2062 84/500 [====>.........................] - ETA: 2:40 - loss: 1.4515 - regression_loss: 1.2465 - classification_loss: 0.2050 85/500 [====>.........................] - ETA: 2:39 - loss: 1.4539 - regression_loss: 1.2488 - classification_loss: 0.2051 86/500 [====>.........................] - ETA: 2:39 - loss: 1.4531 - regression_loss: 1.2484 - classification_loss: 0.2047 87/500 [====>.........................] - ETA: 2:38 - loss: 1.4480 - regression_loss: 1.2442 - classification_loss: 0.2038 88/500 [====>.........................] - ETA: 2:38 - loss: 1.4483 - regression_loss: 1.2445 - classification_loss: 0.2038 89/500 [====>.........................] - ETA: 2:37 - loss: 1.4420 - regression_loss: 1.2390 - classification_loss: 0.2029 90/500 [====>.........................] - ETA: 2:37 - loss: 1.4462 - regression_loss: 1.2423 - classification_loss: 0.2039 91/500 [====>.........................] - ETA: 2:37 - loss: 1.4364 - regression_loss: 1.2333 - classification_loss: 0.2030 92/500 [====>.........................] - ETA: 2:36 - loss: 1.4395 - regression_loss: 1.2359 - classification_loss: 0.2036 93/500 [====>.........................] - ETA: 2:36 - loss: 1.4461 - regression_loss: 1.2414 - classification_loss: 0.2047 94/500 [====>.........................] - ETA: 2:35 - loss: 1.4443 - regression_loss: 1.2405 - classification_loss: 0.2038 95/500 [====>.........................] - ETA: 2:35 - loss: 1.4474 - regression_loss: 1.2434 - classification_loss: 0.2040 96/500 [====>.........................] - ETA: 2:35 - loss: 1.4449 - regression_loss: 1.2413 - classification_loss: 0.2036 97/500 [====>.........................] - ETA: 2:34 - loss: 1.4411 - regression_loss: 1.2381 - classification_loss: 0.2030 98/500 [====>.........................] 
- ETA: 2:34 - loss: 1.4427 - regression_loss: 1.2401 - classification_loss: 0.2026 99/500 [====>.........................] - ETA: 2:34 - loss: 1.4389 - regression_loss: 1.2367 - classification_loss: 0.2022 100/500 [=====>........................] - ETA: 2:33 - loss: 1.4325 - regression_loss: 1.2314 - classification_loss: 0.2012 101/500 [=====>........................] - ETA: 2:33 - loss: 1.4334 - regression_loss: 1.2323 - classification_loss: 0.2011 102/500 [=====>........................] - ETA: 2:32 - loss: 1.4337 - regression_loss: 1.2319 - classification_loss: 0.2018 103/500 [=====>........................] - ETA: 2:32 - loss: 1.4341 - regression_loss: 1.2311 - classification_loss: 0.2029 104/500 [=====>........................] - ETA: 2:32 - loss: 1.4345 - regression_loss: 1.2312 - classification_loss: 0.2033 105/500 [=====>........................] - ETA: 2:31 - loss: 1.4383 - regression_loss: 1.2333 - classification_loss: 0.2050 106/500 [=====>........................] - ETA: 2:31 - loss: 1.4528 - regression_loss: 1.2390 - classification_loss: 0.2138 107/500 [=====>........................] - ETA: 2:30 - loss: 1.4604 - regression_loss: 1.2460 - classification_loss: 0.2144 108/500 [=====>........................] - ETA: 2:30 - loss: 1.4563 - regression_loss: 1.2426 - classification_loss: 0.2137 109/500 [=====>........................] - ETA: 2:30 - loss: 1.4579 - regression_loss: 1.2436 - classification_loss: 0.2143 110/500 [=====>........................] - ETA: 2:29 - loss: 1.4609 - regression_loss: 1.2461 - classification_loss: 0.2148 111/500 [=====>........................] - ETA: 2:29 - loss: 1.4637 - regression_loss: 1.2481 - classification_loss: 0.2156 112/500 [=====>........................] - ETA: 2:29 - loss: 1.4638 - regression_loss: 1.2485 - classification_loss: 0.2153 113/500 [=====>........................] - ETA: 2:28 - loss: 1.4623 - regression_loss: 1.2466 - classification_loss: 0.2157 114/500 [=====>........................] 
- ETA: 2:28 - loss: 1.4624 - regression_loss: 1.2470 - classification_loss: 0.2154 115/500 [=====>........................] - ETA: 2:27 - loss: 1.4671 - regression_loss: 1.2514 - classification_loss: 0.2157 116/500 [=====>........................] - ETA: 2:27 - loss: 1.4693 - regression_loss: 1.2534 - classification_loss: 0.2159 117/500 [======>.......................] - ETA: 2:27 - loss: 1.4701 - regression_loss: 1.2541 - classification_loss: 0.2160 118/500 [======>.......................] - ETA: 2:26 - loss: 1.4709 - regression_loss: 1.2551 - classification_loss: 0.2158 119/500 [======>.......................] - ETA: 2:26 - loss: 1.4726 - regression_loss: 1.2569 - classification_loss: 0.2157 120/500 [======>.......................] - ETA: 2:26 - loss: 1.4700 - regression_loss: 1.2549 - classification_loss: 0.2151 121/500 [======>.......................] - ETA: 2:25 - loss: 1.4726 - regression_loss: 1.2573 - classification_loss: 0.2153 122/500 [======>.......................] - ETA: 2:25 - loss: 1.4744 - regression_loss: 1.2583 - classification_loss: 0.2160 123/500 [======>.......................] - ETA: 2:24 - loss: 1.4699 - regression_loss: 1.2542 - classification_loss: 0.2157 124/500 [======>.......................] - ETA: 2:24 - loss: 1.4713 - regression_loss: 1.2555 - classification_loss: 0.2158 125/500 [======>.......................] - ETA: 2:24 - loss: 1.4723 - regression_loss: 1.2565 - classification_loss: 0.2158 126/500 [======>.......................] - ETA: 2:23 - loss: 1.4753 - regression_loss: 1.2587 - classification_loss: 0.2167 127/500 [======>.......................] - ETA: 2:23 - loss: 1.4696 - regression_loss: 1.2538 - classification_loss: 0.2158 128/500 [======>.......................] - ETA: 2:22 - loss: 1.4698 - regression_loss: 1.2543 - classification_loss: 0.2155 129/500 [======>.......................] - ETA: 2:22 - loss: 1.4658 - regression_loss: 1.2512 - classification_loss: 0.2145 130/500 [======>.......................] 
- ETA: 2:22 - loss: 1.4608 - regression_loss: 1.2471 - classification_loss: 0.2137 131/500 [======>.......................] - ETA: 2:21 - loss: 1.4626 - regression_loss: 1.2489 - classification_loss: 0.2137 132/500 [======>.......................] - ETA: 2:21 - loss: 1.4610 - regression_loss: 1.2474 - classification_loss: 0.2135 133/500 [======>.......................] - ETA: 2:20 - loss: 1.4628 - regression_loss: 1.2487 - classification_loss: 0.2142 134/500 [=======>......................] - ETA: 2:20 - loss: 1.4579 - regression_loss: 1.2448 - classification_loss: 0.2132 135/500 [=======>......................] - ETA: 2:20 - loss: 1.4591 - regression_loss: 1.2456 - classification_loss: 0.2135 136/500 [=======>......................] - ETA: 2:19 - loss: 1.4582 - regression_loss: 1.2449 - classification_loss: 0.2132 137/500 [=======>......................] - ETA: 2:19 - loss: 1.4541 - regression_loss: 1.2413 - classification_loss: 0.2128 138/500 [=======>......................] - ETA: 2:18 - loss: 1.4549 - regression_loss: 1.2421 - classification_loss: 0.2127 139/500 [=======>......................] - ETA: 2:18 - loss: 1.4519 - regression_loss: 1.2397 - classification_loss: 0.2122 140/500 [=======>......................] - ETA: 2:18 - loss: 1.4489 - regression_loss: 1.2372 - classification_loss: 0.2117 141/500 [=======>......................] - ETA: 2:17 - loss: 1.4484 - regression_loss: 1.2370 - classification_loss: 0.2114 142/500 [=======>......................] - ETA: 2:17 - loss: 1.4491 - regression_loss: 1.2379 - classification_loss: 0.2111 143/500 [=======>......................] - ETA: 2:16 - loss: 1.4450 - regression_loss: 1.2346 - classification_loss: 0.2104 144/500 [=======>......................] - ETA: 2:16 - loss: 1.4415 - regression_loss: 1.2320 - classification_loss: 0.2095 145/500 [=======>......................] - ETA: 2:15 - loss: 1.4405 - regression_loss: 1.2312 - classification_loss: 0.2093 146/500 [=======>......................] 
- ETA: 2:15 - loss: 1.4410 - regression_loss: 1.2317 - classification_loss: 0.2093 147/500 [=======>......................] - ETA: 2:15 - loss: 1.4434 - regression_loss: 1.2339 - classification_loss: 0.2094 148/500 [=======>......................] - ETA: 2:14 - loss: 1.4459 - regression_loss: 1.2363 - classification_loss: 0.2095 149/500 [=======>......................] - ETA: 2:14 - loss: 1.4466 - regression_loss: 1.2372 - classification_loss: 0.2094 150/500 [========>.....................] - ETA: 2:13 - loss: 1.4481 - regression_loss: 1.2387 - classification_loss: 0.2094 151/500 [========>.....................] - ETA: 2:13 - loss: 1.4505 - regression_loss: 1.2411 - classification_loss: 0.2095 152/500 [========>.....................] - ETA: 2:13 - loss: 1.4462 - regression_loss: 1.2369 - classification_loss: 0.2092 153/500 [========>.....................] - ETA: 2:12 - loss: 1.4459 - regression_loss: 1.2368 - classification_loss: 0.2091 154/500 [========>.....................] - ETA: 2:12 - loss: 1.4449 - regression_loss: 1.2359 - classification_loss: 0.2090 155/500 [========>.....................] - ETA: 2:11 - loss: 1.4475 - regression_loss: 1.2382 - classification_loss: 0.2093 156/500 [========>.....................] - ETA: 2:11 - loss: 1.4490 - regression_loss: 1.2398 - classification_loss: 0.2092 157/500 [========>.....................] - ETA: 2:11 - loss: 1.4465 - regression_loss: 1.2377 - classification_loss: 0.2088 158/500 [========>.....................] - ETA: 2:10 - loss: 1.4484 - regression_loss: 1.2397 - classification_loss: 0.2087 159/500 [========>.....................] - ETA: 2:10 - loss: 1.4450 - regression_loss: 1.2369 - classification_loss: 0.2080 160/500 [========>.....................] - ETA: 2:10 - loss: 1.4444 - regression_loss: 1.2365 - classification_loss: 0.2079 161/500 [========>.....................] - ETA: 2:09 - loss: 1.4433 - regression_loss: 1.2356 - classification_loss: 0.2077 162/500 [========>.....................] 
- ETA: 2:09 - loss: 1.4464 - regression_loss: 1.2385 - classification_loss: 0.2079 163/500 [========>.....................] - ETA: 2:09 - loss: 1.4493 - regression_loss: 1.2410 - classification_loss: 0.2084 164/500 [========>.....................] - ETA: 2:08 - loss: 1.4482 - regression_loss: 1.2399 - classification_loss: 0.2083 165/500 [========>.....................] - ETA: 2:08 - loss: 1.4491 - regression_loss: 1.2408 - classification_loss: 0.2083 166/500 [========>.....................] - ETA: 2:08 - loss: 1.4472 - regression_loss: 1.2393 - classification_loss: 0.2080 167/500 [=========>....................] - ETA: 2:07 - loss: 1.4486 - regression_loss: 1.2407 - classification_loss: 0.2080 168/500 [=========>....................] - ETA: 2:07 - loss: 1.4484 - regression_loss: 1.2405 - classification_loss: 0.2079 169/500 [=========>....................] - ETA: 2:06 - loss: 1.4510 - regression_loss: 1.2425 - classification_loss: 0.2085 170/500 [=========>....................] - ETA: 2:06 - loss: 1.4488 - regression_loss: 1.2410 - classification_loss: 0.2078 171/500 [=========>....................] - ETA: 2:06 - loss: 1.4479 - regression_loss: 1.2403 - classification_loss: 0.2076 172/500 [=========>....................] - ETA: 2:05 - loss: 1.4465 - regression_loss: 1.2394 - classification_loss: 0.2071 173/500 [=========>....................] - ETA: 2:05 - loss: 1.4434 - regression_loss: 1.2368 - classification_loss: 0.2066 174/500 [=========>....................] - ETA: 2:04 - loss: 1.4378 - regression_loss: 1.2320 - classification_loss: 0.2058 175/500 [=========>....................] - ETA: 2:04 - loss: 1.4388 - regression_loss: 1.2330 - classification_loss: 0.2058 176/500 [=========>....................] - ETA: 2:04 - loss: 1.4386 - regression_loss: 1.2329 - classification_loss: 0.2057 177/500 [=========>....................] - ETA: 2:03 - loss: 1.4398 - regression_loss: 1.2345 - classification_loss: 0.2052 178/500 [=========>....................] 
- ETA: 2:03 - loss: 1.4403 - regression_loss: 1.2349 - classification_loss: 0.2054 179/500 [=========>....................] - ETA: 2:03 - loss: 1.4388 - regression_loss: 1.2338 - classification_loss: 0.2050 180/500 [=========>....................] - ETA: 2:02 - loss: 1.4378 - regression_loss: 1.2330 - classification_loss: 0.2048 181/500 [=========>....................] - ETA: 2:02 - loss: 1.4372 - regression_loss: 1.2327 - classification_loss: 0.2045 182/500 [=========>....................] - ETA: 2:01 - loss: 1.4370 - regression_loss: 1.2325 - classification_loss: 0.2045 183/500 [=========>....................] - ETA: 2:01 - loss: 1.4354 - regression_loss: 1.2314 - classification_loss: 0.2039 184/500 [==========>...................] - ETA: 2:01 - loss: 1.4341 - regression_loss: 1.2303 - classification_loss: 0.2038 185/500 [==========>...................] - ETA: 2:00 - loss: 1.4338 - regression_loss: 1.2304 - classification_loss: 0.2033 186/500 [==========>...................] - ETA: 2:00 - loss: 1.4303 - regression_loss: 1.2276 - classification_loss: 0.2027 187/500 [==========>...................] - ETA: 1:59 - loss: 1.4295 - regression_loss: 1.2267 - classification_loss: 0.2027 188/500 [==========>...................] - ETA: 1:59 - loss: 1.4275 - regression_loss: 1.2250 - classification_loss: 0.2024 189/500 [==========>...................] - ETA: 1:59 - loss: 1.4281 - regression_loss: 1.2258 - classification_loss: 0.2023 190/500 [==========>...................] - ETA: 1:58 - loss: 1.4260 - regression_loss: 1.2242 - classification_loss: 0.2018 191/500 [==========>...................] - ETA: 1:58 - loss: 1.4239 - regression_loss: 1.2226 - classification_loss: 0.2013 192/500 [==========>...................] - ETA: 1:58 - loss: 1.4233 - regression_loss: 1.2224 - classification_loss: 0.2009 193/500 [==========>...................] - ETA: 1:57 - loss: 1.4239 - regression_loss: 1.2229 - classification_loss: 0.2010 194/500 [==========>...................] 
- ETA: 1:57 - loss: 1.4195 - regression_loss: 1.2192 - classification_loss: 0.2004 195/500 [==========>...................] - ETA: 1:56 - loss: 1.4200 - regression_loss: 1.2194 - classification_loss: 0.2006 196/500 [==========>...................] - ETA: 1:56 - loss: 1.4172 - regression_loss: 1.2172 - classification_loss: 0.2000 197/500 [==========>...................] - ETA: 1:56 - loss: 1.4148 - regression_loss: 1.2152 - classification_loss: 0.1996 198/500 [==========>...................] - ETA: 1:55 - loss: 1.4152 - regression_loss: 1.2157 - classification_loss: 0.1995 199/500 [==========>...................] - ETA: 1:55 - loss: 1.4175 - regression_loss: 1.2181 - classification_loss: 0.1994 200/500 [===========>..................] - ETA: 1:54 - loss: 1.4169 - regression_loss: 1.2175 - classification_loss: 0.1994 201/500 [===========>..................] - ETA: 1:54 - loss: 1.4164 - regression_loss: 1.2169 - classification_loss: 0.1995 202/500 [===========>..................] - ETA: 1:54 - loss: 1.4146 - regression_loss: 1.2154 - classification_loss: 0.1992 203/500 [===========>..................] - ETA: 1:53 - loss: 1.4120 - regression_loss: 1.2135 - classification_loss: 0.1986 204/500 [===========>..................] - ETA: 1:53 - loss: 1.4120 - regression_loss: 1.2136 - classification_loss: 0.1985 205/500 [===========>..................] - ETA: 1:53 - loss: 1.4098 - regression_loss: 1.2118 - classification_loss: 0.1980 206/500 [===========>..................] - ETA: 1:52 - loss: 1.4106 - regression_loss: 1.2125 - classification_loss: 0.1981 207/500 [===========>..................] - ETA: 1:52 - loss: 1.4099 - regression_loss: 1.2120 - classification_loss: 0.1980 208/500 [===========>..................] - ETA: 1:51 - loss: 1.4053 - regression_loss: 1.2081 - classification_loss: 0.1973 209/500 [===========>..................] - ETA: 1:51 - loss: 1.4019 - regression_loss: 1.2051 - classification_loss: 0.1968 210/500 [===========>..................] 
- ETA: 1:51 - loss: 1.4029 - regression_loss: 1.2059 - classification_loss: 0.1970 211/500 [===========>..................] - ETA: 1:50 - loss: 1.4034 - regression_loss: 1.2066 - classification_loss: 0.1968 212/500 [===========>..................] - ETA: 1:50 - loss: 1.4007 - regression_loss: 1.2042 - classification_loss: 0.1966 213/500 [===========>..................] - ETA: 1:49 - loss: 1.4033 - regression_loss: 1.2064 - classification_loss: 0.1969 214/500 [===========>..................] - ETA: 1:49 - loss: 1.4042 - regression_loss: 1.2071 - classification_loss: 0.1971 215/500 [===========>..................] - ETA: 1:49 - loss: 1.4054 - regression_loss: 1.2082 - classification_loss: 0.1972 216/500 [===========>..................] - ETA: 1:48 - loss: 1.4060 - regression_loss: 1.2088 - classification_loss: 0.1972 217/500 [============>.................] - ETA: 1:48 - loss: 1.4076 - regression_loss: 1.2099 - classification_loss: 0.1978 218/500 [============>.................] - ETA: 1:47 - loss: 1.4070 - regression_loss: 1.2095 - classification_loss: 0.1975 219/500 [============>.................] - ETA: 1:47 - loss: 1.4037 - regression_loss: 1.2067 - classification_loss: 0.1970 220/500 [============>.................] - ETA: 1:47 - loss: 1.4020 - regression_loss: 1.2052 - classification_loss: 0.1968 221/500 [============>.................] - ETA: 1:46 - loss: 1.4032 - regression_loss: 1.2063 - classification_loss: 0.1969 222/500 [============>.................] - ETA: 1:46 - loss: 1.4005 - regression_loss: 1.2039 - classification_loss: 0.1966 223/500 [============>.................] - ETA: 1:46 - loss: 1.3983 - regression_loss: 1.2020 - classification_loss: 0.1964 224/500 [============>.................] - ETA: 1:45 - loss: 1.3954 - regression_loss: 1.1995 - classification_loss: 0.1959 225/500 [============>.................] - ETA: 1:45 - loss: 1.3947 - regression_loss: 1.1989 - classification_loss: 0.1958 226/500 [============>.................] 
- ETA: 1:45 - loss: 1.3933 - regression_loss: 1.1978 - classification_loss: 0.1955 227/500 [============>.................] - ETA: 1:44 - loss: 1.3931 - regression_loss: 1.1978 - classification_loss: 0.1953 228/500 [============>.................] - ETA: 1:44 - loss: 1.3938 - regression_loss: 1.1984 - classification_loss: 0.1954 229/500 [============>.................] - ETA: 1:43 - loss: 1.3933 - regression_loss: 1.1981 - classification_loss: 0.1952 230/500 [============>.................] - ETA: 1:43 - loss: 1.3949 - regression_loss: 1.1996 - classification_loss: 0.1954 231/500 [============>.................] - ETA: 1:43 - loss: 1.3953 - regression_loss: 1.2001 - classification_loss: 0.1953 232/500 [============>.................] - ETA: 1:42 - loss: 1.3933 - regression_loss: 1.1983 - classification_loss: 0.1950 233/500 [============>.................] - ETA: 1:42 - loss: 1.3941 - regression_loss: 1.1992 - classification_loss: 0.1949 234/500 [=============>................] - ETA: 1:42 - loss: 1.3939 - regression_loss: 1.1991 - classification_loss: 0.1948 235/500 [=============>................] - ETA: 1:41 - loss: 1.3945 - regression_loss: 1.1994 - classification_loss: 0.1951 236/500 [=============>................] - ETA: 1:41 - loss: 1.3938 - regression_loss: 1.1987 - classification_loss: 0.1951 237/500 [=============>................] - ETA: 1:40 - loss: 1.3935 - regression_loss: 1.1986 - classification_loss: 0.1950 238/500 [=============>................] - ETA: 1:40 - loss: 1.3948 - regression_loss: 1.1997 - classification_loss: 0.1951 239/500 [=============>................] - ETA: 1:40 - loss: 1.3945 - regression_loss: 1.1994 - classification_loss: 0.1951 240/500 [=============>................] - ETA: 1:39 - loss: 1.3930 - regression_loss: 1.1982 - classification_loss: 0.1948 241/500 [=============>................] - ETA: 1:39 - loss: 1.3941 - regression_loss: 1.1992 - classification_loss: 0.1949 242/500 [=============>................] 
- ETA: 1:39 - loss: 1.3975 - regression_loss: 1.2016 - classification_loss: 0.1959 243/500 [=============>................] - ETA: 1:38 - loss: 1.3977 - regression_loss: 1.2015 - classification_loss: 0.1962 244/500 [=============>................] - ETA: 1:38 - loss: 1.3960 - regression_loss: 1.2001 - classification_loss: 0.1959 245/500 [=============>................] - ETA: 1:37 - loss: 1.3967 - regression_loss: 1.2006 - classification_loss: 0.1960 246/500 [=============>................] - ETA: 1:37 - loss: 1.3979 - regression_loss: 1.2018 - classification_loss: 0.1961 247/500 [=============>................] - ETA: 1:37 - loss: 1.3978 - regression_loss: 1.2017 - classification_loss: 0.1961 248/500 [=============>................] - ETA: 1:36 - loss: 1.3946 - regression_loss: 1.1990 - classification_loss: 0.1956 249/500 [=============>................] - ETA: 1:36 - loss: 1.3946 - regression_loss: 1.1991 - classification_loss: 0.1955 250/500 [==============>...............] - ETA: 1:35 - loss: 1.3955 - regression_loss: 1.1998 - classification_loss: 0.1957 251/500 [==============>...............] - ETA: 1:35 - loss: 1.3942 - regression_loss: 1.1987 - classification_loss: 0.1955 252/500 [==============>...............] - ETA: 1:35 - loss: 1.3939 - regression_loss: 1.1986 - classification_loss: 0.1953 253/500 [==============>...............] - ETA: 1:34 - loss: 1.3948 - regression_loss: 1.1993 - classification_loss: 0.1955 254/500 [==============>...............] - ETA: 1:34 - loss: 1.3955 - regression_loss: 1.2000 - classification_loss: 0.1955 255/500 [==============>...............] - ETA: 1:33 - loss: 1.3939 - regression_loss: 1.1987 - classification_loss: 0.1952 256/500 [==============>...............] - ETA: 1:33 - loss: 1.3939 - regression_loss: 1.1987 - classification_loss: 0.1952 257/500 [==============>...............] - ETA: 1:33 - loss: 1.3932 - regression_loss: 1.1982 - classification_loss: 0.1950 258/500 [==============>...............] 
- ETA: 1:32 - loss: 1.3935 - regression_loss: 1.1982 - classification_loss: 0.1953 259/500 [==============>...............] - ETA: 1:32 - loss: 1.3939 - regression_loss: 1.1986 - classification_loss: 0.1953 260/500 [==============>...............] - ETA: 1:32 - loss: 1.3943 - regression_loss: 1.1990 - classification_loss: 0.1953 261/500 [==============>...............] - ETA: 1:31 - loss: 1.3957 - regression_loss: 1.1999 - classification_loss: 0.1957 262/500 [==============>...............] - ETA: 1:31 - loss: 1.3963 - regression_loss: 1.2005 - classification_loss: 0.1958 263/500 [==============>...............] - ETA: 1:30 - loss: 1.3981 - regression_loss: 1.2021 - classification_loss: 0.1959 264/500 [==============>...............] - ETA: 1:30 - loss: 1.3988 - regression_loss: 1.2027 - classification_loss: 0.1962 265/500 [==============>...............] - ETA: 1:30 - loss: 1.3985 - regression_loss: 1.2025 - classification_loss: 0.1961 266/500 [==============>...............] - ETA: 1:29 - loss: 1.3988 - regression_loss: 1.2026 - classification_loss: 0.1962 267/500 [===============>..............] - ETA: 1:29 - loss: 1.3993 - regression_loss: 1.2031 - classification_loss: 0.1962 268/500 [===============>..............] - ETA: 1:28 - loss: 1.3968 - regression_loss: 1.2011 - classification_loss: 0.1958 269/500 [===============>..............] - ETA: 1:28 - loss: 1.3943 - regression_loss: 1.1988 - classification_loss: 0.1954 270/500 [===============>..............] - ETA: 1:28 - loss: 1.3955 - regression_loss: 1.1997 - classification_loss: 0.1958 271/500 [===============>..............] - ETA: 1:27 - loss: 1.3962 - regression_loss: 1.2003 - classification_loss: 0.1959 272/500 [===============>..............] - ETA: 1:27 - loss: 1.3964 - regression_loss: 1.2007 - classification_loss: 0.1956 273/500 [===============>..............] - ETA: 1:26 - loss: 1.3958 - regression_loss: 1.2002 - classification_loss: 0.1956 274/500 [===============>..............] 
- ETA: 1:26 - loss: 1.3969 - regression_loss: 1.2013 - classification_loss: 0.1956 275/500 [===============>..............] - ETA: 1:26 - loss: 1.3970 - regression_loss: 1.2015 - classification_loss: 0.1956 276/500 [===============>..............] - ETA: 1:25 - loss: 1.3947 - regression_loss: 1.1994 - classification_loss: 0.1952 277/500 [===============>..............] - ETA: 1:25 - loss: 1.3929 - regression_loss: 1.1978 - classification_loss: 0.1951 278/500 [===============>..............] - ETA: 1:25 - loss: 1.3912 - regression_loss: 1.1960 - classification_loss: 0.1952 279/500 [===============>..............] - ETA: 1:24 - loss: 1.3932 - regression_loss: 1.1977 - classification_loss: 0.1954 280/500 [===============>..............] - ETA: 1:24 - loss: 1.3932 - regression_loss: 1.1978 - classification_loss: 0.1953 281/500 [===============>..............] - ETA: 1:23 - loss: 1.3934 - regression_loss: 1.1978 - classification_loss: 0.1956 282/500 [===============>..............] - ETA: 1:23 - loss: 1.3938 - regression_loss: 1.1981 - classification_loss: 0.1957 283/500 [===============>..............] - ETA: 1:23 - loss: 1.3945 - regression_loss: 1.1988 - classification_loss: 0.1957 284/500 [================>.............] - ETA: 1:22 - loss: 1.3944 - regression_loss: 1.1988 - classification_loss: 0.1956 285/500 [================>.............] - ETA: 1:22 - loss: 1.3976 - regression_loss: 1.2016 - classification_loss: 0.1960 286/500 [================>.............] - ETA: 1:21 - loss: 1.3967 - regression_loss: 1.2008 - classification_loss: 0.1958 287/500 [================>.............] - ETA: 1:21 - loss: 1.3969 - regression_loss: 1.2011 - classification_loss: 0.1958 288/500 [================>.............] - ETA: 1:21 - loss: 1.3958 - regression_loss: 1.2000 - classification_loss: 0.1958 289/500 [================>.............] - ETA: 1:20 - loss: 1.3964 - regression_loss: 1.2004 - classification_loss: 0.1960 290/500 [================>.............] 
[... epoch 6 per-step progress updates (steps 291-499, ETA 1:20 -> 0s) elided; loss hovered around 1.37-1.40 ...]
500/500 [==============================] - 192s 383ms/step - loss: 1.3745 - regression_loss: 1.1873 - classification_loss: 0.1872
1172 instances of class plum with average precision: 0.6585
mAP: 0.6585
Epoch 00006: saving model to ./training/snapshots/resnet101_pascal_06.h5
Epoch 7/20
[... epoch 7 per-step progress updates (steps 1-125, ETA ~3:10 -> 2:25) elided; loss hovered around 1.0-1.27 ...]
- ETA: 2:25 - loss: 1.2483 - regression_loss: 1.0835 - classification_loss: 0.1648 126/500 [======>.......................] - ETA: 2:24 - loss: 1.2483 - regression_loss: 1.0837 - classification_loss: 0.1646 127/500 [======>.......................] - ETA: 2:24 - loss: 1.2446 - regression_loss: 1.0806 - classification_loss: 0.1640 128/500 [======>.......................] - ETA: 2:24 - loss: 1.2473 - regression_loss: 1.0827 - classification_loss: 0.1646 129/500 [======>.......................] - ETA: 2:23 - loss: 1.2498 - regression_loss: 1.0850 - classification_loss: 0.1648 130/500 [======>.......................] - ETA: 2:23 - loss: 1.2513 - regression_loss: 1.0863 - classification_loss: 0.1650 131/500 [======>.......................] - ETA: 2:22 - loss: 1.2500 - regression_loss: 1.0852 - classification_loss: 0.1649 132/500 [======>.......................] - ETA: 2:22 - loss: 1.2506 - regression_loss: 1.0859 - classification_loss: 0.1647 133/500 [======>.......................] - ETA: 2:22 - loss: 1.2521 - regression_loss: 1.0875 - classification_loss: 0.1647 134/500 [=======>......................] - ETA: 2:21 - loss: 1.2490 - regression_loss: 1.0849 - classification_loss: 0.1641 135/500 [=======>......................] - ETA: 2:21 - loss: 1.2461 - regression_loss: 1.0825 - classification_loss: 0.1636 136/500 [=======>......................] - ETA: 2:20 - loss: 1.2463 - regression_loss: 1.0827 - classification_loss: 0.1636 137/500 [=======>......................] - ETA: 2:20 - loss: 1.2446 - regression_loss: 1.0817 - classification_loss: 0.1630 138/500 [=======>......................] - ETA: 2:20 - loss: 1.2432 - regression_loss: 1.0804 - classification_loss: 0.1628 139/500 [=======>......................] - ETA: 2:19 - loss: 1.2447 - regression_loss: 1.0817 - classification_loss: 0.1630 140/500 [=======>......................] - ETA: 2:19 - loss: 1.2474 - regression_loss: 1.0842 - classification_loss: 0.1632 141/500 [=======>......................] 
- ETA: 2:18 - loss: 1.2480 - regression_loss: 1.0849 - classification_loss: 0.1631 142/500 [=======>......................] - ETA: 2:18 - loss: 1.2474 - regression_loss: 1.0846 - classification_loss: 0.1628 143/500 [=======>......................] - ETA: 2:18 - loss: 1.2462 - regression_loss: 1.0836 - classification_loss: 0.1626 144/500 [=======>......................] - ETA: 2:17 - loss: 1.2456 - regression_loss: 1.0827 - classification_loss: 0.1630 145/500 [=======>......................] - ETA: 2:17 - loss: 1.2454 - regression_loss: 1.0822 - classification_loss: 0.1631 146/500 [=======>......................] - ETA: 2:16 - loss: 1.2488 - regression_loss: 1.0853 - classification_loss: 0.1635 147/500 [=======>......................] - ETA: 2:16 - loss: 1.2448 - regression_loss: 1.0819 - classification_loss: 0.1629 148/500 [=======>......................] - ETA: 2:16 - loss: 1.2405 - regression_loss: 1.0782 - classification_loss: 0.1623 149/500 [=======>......................] - ETA: 2:15 - loss: 1.2398 - regression_loss: 1.0775 - classification_loss: 0.1623 150/500 [========>.....................] - ETA: 2:15 - loss: 1.2366 - regression_loss: 1.0750 - classification_loss: 0.1616 151/500 [========>.....................] - ETA: 2:14 - loss: 1.2356 - regression_loss: 1.0742 - classification_loss: 0.1614 152/500 [========>.....................] - ETA: 2:14 - loss: 1.2365 - regression_loss: 1.0749 - classification_loss: 0.1615 153/500 [========>.....................] - ETA: 2:14 - loss: 1.2320 - regression_loss: 1.0710 - classification_loss: 0.1610 154/500 [========>.....................] - ETA: 2:13 - loss: 1.2349 - regression_loss: 1.0734 - classification_loss: 0.1614 155/500 [========>.....................] - ETA: 2:13 - loss: 1.2346 - regression_loss: 1.0732 - classification_loss: 0.1614 156/500 [========>.....................] - ETA: 2:12 - loss: 1.2334 - regression_loss: 1.0721 - classification_loss: 0.1613 157/500 [========>.....................] 
- ETA: 2:12 - loss: 1.2346 - regression_loss: 1.0732 - classification_loss: 0.1615 158/500 [========>.....................] - ETA: 2:12 - loss: 1.2365 - regression_loss: 1.0748 - classification_loss: 0.1617 159/500 [========>.....................] - ETA: 2:11 - loss: 1.2334 - regression_loss: 1.0721 - classification_loss: 0.1613 160/500 [========>.....................] - ETA: 2:11 - loss: 1.2344 - regression_loss: 1.0729 - classification_loss: 0.1615 161/500 [========>.....................] - ETA: 2:10 - loss: 1.2367 - regression_loss: 1.0752 - classification_loss: 0.1614 162/500 [========>.....................] - ETA: 2:10 - loss: 1.2383 - regression_loss: 1.0768 - classification_loss: 0.1614 163/500 [========>.....................] - ETA: 2:09 - loss: 1.2391 - regression_loss: 1.0776 - classification_loss: 0.1615 164/500 [========>.....................] - ETA: 2:09 - loss: 1.2462 - regression_loss: 1.0830 - classification_loss: 0.1632 165/500 [========>.....................] - ETA: 2:09 - loss: 1.2447 - regression_loss: 1.0810 - classification_loss: 0.1637 166/500 [========>.....................] - ETA: 2:08 - loss: 1.2478 - regression_loss: 1.0831 - classification_loss: 0.1647 167/500 [=========>....................] - ETA: 2:08 - loss: 1.2475 - regression_loss: 1.0828 - classification_loss: 0.1647 168/500 [=========>....................] - ETA: 2:07 - loss: 1.2508 - regression_loss: 1.0856 - classification_loss: 0.1652 169/500 [=========>....................] - ETA: 2:07 - loss: 1.2530 - regression_loss: 1.0875 - classification_loss: 0.1656 170/500 [=========>....................] - ETA: 2:07 - loss: 1.2542 - regression_loss: 1.0885 - classification_loss: 0.1657 171/500 [=========>....................] - ETA: 2:06 - loss: 1.2548 - regression_loss: 1.0889 - classification_loss: 0.1659 172/500 [=========>....................] - ETA: 2:06 - loss: 1.2555 - regression_loss: 1.0894 - classification_loss: 0.1661 173/500 [=========>....................] 
- ETA: 2:05 - loss: 1.2532 - regression_loss: 1.0873 - classification_loss: 0.1659 174/500 [=========>....................] - ETA: 2:05 - loss: 1.2501 - regression_loss: 1.0847 - classification_loss: 0.1653 175/500 [=========>....................] - ETA: 2:05 - loss: 1.2502 - regression_loss: 1.0849 - classification_loss: 0.1653 176/500 [=========>....................] - ETA: 2:04 - loss: 1.2518 - regression_loss: 1.0863 - classification_loss: 0.1655 177/500 [=========>....................] - ETA: 2:04 - loss: 1.2551 - regression_loss: 1.0889 - classification_loss: 0.1662 178/500 [=========>....................] - ETA: 2:03 - loss: 1.2553 - regression_loss: 1.0889 - classification_loss: 0.1664 179/500 [=========>....................] - ETA: 2:03 - loss: 1.2541 - regression_loss: 1.0882 - classification_loss: 0.1659 180/500 [=========>....................] - ETA: 2:03 - loss: 1.2542 - regression_loss: 1.0882 - classification_loss: 0.1661 181/500 [=========>....................] - ETA: 2:02 - loss: 1.2543 - regression_loss: 1.0883 - classification_loss: 0.1660 182/500 [=========>....................] - ETA: 2:02 - loss: 1.2512 - regression_loss: 1.0857 - classification_loss: 0.1654 183/500 [=========>....................] - ETA: 2:01 - loss: 1.2524 - regression_loss: 1.0869 - classification_loss: 0.1655 184/500 [==========>...................] - ETA: 2:01 - loss: 1.2517 - regression_loss: 1.0863 - classification_loss: 0.1654 185/500 [==========>...................] - ETA: 2:01 - loss: 1.2500 - regression_loss: 1.0845 - classification_loss: 0.1656 186/500 [==========>...................] - ETA: 2:00 - loss: 1.2517 - regression_loss: 1.0860 - classification_loss: 0.1657 187/500 [==========>...................] - ETA: 2:00 - loss: 1.2572 - regression_loss: 1.0905 - classification_loss: 0.1668 188/500 [==========>...................] - ETA: 1:59 - loss: 1.2547 - regression_loss: 1.0880 - classification_loss: 0.1667 189/500 [==========>...................] 
- ETA: 1:59 - loss: 1.2518 - regression_loss: 1.0854 - classification_loss: 0.1664 190/500 [==========>...................] - ETA: 1:59 - loss: 1.2506 - regression_loss: 1.0844 - classification_loss: 0.1662 191/500 [==========>...................] - ETA: 1:58 - loss: 1.2512 - regression_loss: 1.0849 - classification_loss: 0.1663 192/500 [==========>...................] - ETA: 1:58 - loss: 1.2499 - regression_loss: 1.0837 - classification_loss: 0.1662 193/500 [==========>...................] - ETA: 1:57 - loss: 1.2463 - regression_loss: 1.0806 - classification_loss: 0.1657 194/500 [==========>...................] - ETA: 1:57 - loss: 1.2423 - regression_loss: 1.0772 - classification_loss: 0.1652 195/500 [==========>...................] - ETA: 1:57 - loss: 1.2418 - regression_loss: 1.0766 - classification_loss: 0.1652 196/500 [==========>...................] - ETA: 1:56 - loss: 1.2422 - regression_loss: 1.0769 - classification_loss: 0.1652 197/500 [==========>...................] - ETA: 1:56 - loss: 1.2448 - regression_loss: 1.0790 - classification_loss: 0.1658 198/500 [==========>...................] - ETA: 1:55 - loss: 1.2422 - regression_loss: 1.0769 - classification_loss: 0.1653 199/500 [==========>...................] - ETA: 1:55 - loss: 1.2436 - regression_loss: 1.0781 - classification_loss: 0.1655 200/500 [===========>..................] - ETA: 1:55 - loss: 1.2417 - regression_loss: 1.0764 - classification_loss: 0.1653 201/500 [===========>..................] - ETA: 1:54 - loss: 1.2432 - regression_loss: 1.0776 - classification_loss: 0.1656 202/500 [===========>..................] - ETA: 1:54 - loss: 1.2414 - regression_loss: 1.0761 - classification_loss: 0.1653 203/500 [===========>..................] - ETA: 1:54 - loss: 1.2411 - regression_loss: 1.0761 - classification_loss: 0.1650 204/500 [===========>..................] - ETA: 1:53 - loss: 1.2434 - regression_loss: 1.0781 - classification_loss: 0.1653 205/500 [===========>..................] 
- ETA: 1:53 - loss: 1.2436 - regression_loss: 1.0783 - classification_loss: 0.1653 206/500 [===========>..................] - ETA: 1:52 - loss: 1.2465 - regression_loss: 1.0810 - classification_loss: 0.1655 207/500 [===========>..................] - ETA: 1:52 - loss: 1.2468 - regression_loss: 1.0812 - classification_loss: 0.1656 208/500 [===========>..................] - ETA: 1:52 - loss: 1.2454 - regression_loss: 1.0801 - classification_loss: 0.1653 209/500 [===========>..................] - ETA: 1:51 - loss: 1.2455 - regression_loss: 1.0804 - classification_loss: 0.1652 210/500 [===========>..................] - ETA: 1:51 - loss: 1.2451 - regression_loss: 1.0800 - classification_loss: 0.1651 211/500 [===========>..................] - ETA: 1:50 - loss: 1.2447 - regression_loss: 1.0796 - classification_loss: 0.1650 212/500 [===========>..................] - ETA: 1:50 - loss: 1.2419 - regression_loss: 1.0773 - classification_loss: 0.1645 213/500 [===========>..................] - ETA: 1:50 - loss: 1.2409 - regression_loss: 1.0765 - classification_loss: 0.1644 214/500 [===========>..................] - ETA: 1:49 - loss: 1.2381 - regression_loss: 1.0740 - classification_loss: 0.1641 215/500 [===========>..................] - ETA: 1:49 - loss: 1.2383 - regression_loss: 1.0744 - classification_loss: 0.1639 216/500 [===========>..................] - ETA: 1:48 - loss: 1.2394 - regression_loss: 1.0755 - classification_loss: 0.1639 217/500 [============>.................] - ETA: 1:48 - loss: 1.2393 - regression_loss: 1.0757 - classification_loss: 0.1636 218/500 [============>.................] - ETA: 1:48 - loss: 1.2388 - regression_loss: 1.0754 - classification_loss: 0.1634 219/500 [============>.................] - ETA: 1:47 - loss: 1.2401 - regression_loss: 1.0766 - classification_loss: 0.1635 220/500 [============>.................] - ETA: 1:47 - loss: 1.2380 - regression_loss: 1.0749 - classification_loss: 0.1632 221/500 [============>.................] 
- ETA: 1:47 - loss: 1.2379 - regression_loss: 1.0748 - classification_loss: 0.1631 222/500 [============>.................] - ETA: 1:46 - loss: 1.2409 - regression_loss: 1.0773 - classification_loss: 0.1636 223/500 [============>.................] - ETA: 1:46 - loss: 1.2439 - regression_loss: 1.0803 - classification_loss: 0.1636 224/500 [============>.................] - ETA: 1:45 - loss: 1.2441 - regression_loss: 1.0807 - classification_loss: 0.1634 225/500 [============>.................] - ETA: 1:45 - loss: 1.2461 - regression_loss: 1.0825 - classification_loss: 0.1636 226/500 [============>.................] - ETA: 1:44 - loss: 1.2485 - regression_loss: 1.0847 - classification_loss: 0.1637 227/500 [============>.................] - ETA: 1:44 - loss: 1.2486 - regression_loss: 1.0852 - classification_loss: 0.1634 228/500 [============>.................] - ETA: 1:44 - loss: 1.2496 - regression_loss: 1.0860 - classification_loss: 0.1636 229/500 [============>.................] - ETA: 1:43 - loss: 1.2515 - regression_loss: 1.0877 - classification_loss: 0.1638 230/500 [============>.................] - ETA: 1:43 - loss: 1.2500 - regression_loss: 1.0865 - classification_loss: 0.1635 231/500 [============>.................] - ETA: 1:43 - loss: 1.2497 - regression_loss: 1.0860 - classification_loss: 0.1637 232/500 [============>.................] - ETA: 1:42 - loss: 1.2501 - regression_loss: 1.0864 - classification_loss: 0.1637 233/500 [============>.................] - ETA: 1:42 - loss: 1.2512 - regression_loss: 1.0873 - classification_loss: 0.1639 234/500 [=============>................] - ETA: 1:41 - loss: 1.2528 - regression_loss: 1.0889 - classification_loss: 0.1639 235/500 [=============>................] - ETA: 1:41 - loss: 1.2530 - regression_loss: 1.0892 - classification_loss: 0.1638 236/500 [=============>................] - ETA: 1:41 - loss: 1.2551 - regression_loss: 1.0908 - classification_loss: 0.1642 237/500 [=============>................] 
- ETA: 1:40 - loss: 1.2521 - regression_loss: 1.0881 - classification_loss: 0.1640 238/500 [=============>................] - ETA: 1:40 - loss: 1.2512 - regression_loss: 1.0873 - classification_loss: 0.1639 239/500 [=============>................] - ETA: 1:40 - loss: 1.2514 - regression_loss: 1.0875 - classification_loss: 0.1639 240/500 [=============>................] - ETA: 1:39 - loss: 1.2534 - regression_loss: 1.0893 - classification_loss: 0.1641 241/500 [=============>................] - ETA: 1:39 - loss: 1.2539 - regression_loss: 1.0896 - classification_loss: 0.1642 242/500 [=============>................] - ETA: 1:38 - loss: 1.2521 - regression_loss: 1.0880 - classification_loss: 0.1641 243/500 [=============>................] - ETA: 1:38 - loss: 1.2525 - regression_loss: 1.0884 - classification_loss: 0.1641 244/500 [=============>................] - ETA: 1:38 - loss: 1.2539 - regression_loss: 1.0894 - classification_loss: 0.1645 245/500 [=============>................] - ETA: 1:37 - loss: 1.2518 - regression_loss: 1.0876 - classification_loss: 0.1642 246/500 [=============>................] - ETA: 1:37 - loss: 1.2499 - regression_loss: 1.0861 - classification_loss: 0.1638 247/500 [=============>................] - ETA: 1:36 - loss: 1.2492 - regression_loss: 1.0855 - classification_loss: 0.1637 248/500 [=============>................] - ETA: 1:36 - loss: 1.2470 - regression_loss: 1.0837 - classification_loss: 0.1633 249/500 [=============>................] - ETA: 1:36 - loss: 1.2468 - regression_loss: 1.0837 - classification_loss: 0.1631 250/500 [==============>...............] - ETA: 1:35 - loss: 1.2447 - regression_loss: 1.0820 - classification_loss: 0.1628 251/500 [==============>...............] - ETA: 1:35 - loss: 1.2419 - regression_loss: 1.0796 - classification_loss: 0.1623 252/500 [==============>...............] - ETA: 1:35 - loss: 1.2434 - regression_loss: 1.0809 - classification_loss: 0.1625 253/500 [==============>...............] 
- ETA: 1:34 - loss: 1.2416 - regression_loss: 1.0794 - classification_loss: 0.1623 254/500 [==============>...............] - ETA: 1:34 - loss: 1.2426 - regression_loss: 1.0804 - classification_loss: 0.1623 255/500 [==============>...............] - ETA: 1:33 - loss: 1.2411 - regression_loss: 1.0790 - classification_loss: 0.1621 256/500 [==============>...............] - ETA: 1:33 - loss: 1.2424 - regression_loss: 1.0801 - classification_loss: 0.1622 257/500 [==============>...............] - ETA: 1:33 - loss: 1.2424 - regression_loss: 1.0800 - classification_loss: 0.1624 258/500 [==============>...............] - ETA: 1:32 - loss: 1.2413 - regression_loss: 1.0790 - classification_loss: 0.1623 259/500 [==============>...............] - ETA: 1:32 - loss: 1.2421 - regression_loss: 1.0797 - classification_loss: 0.1623 260/500 [==============>...............] - ETA: 1:31 - loss: 1.2407 - regression_loss: 1.0785 - classification_loss: 0.1623 261/500 [==============>...............] - ETA: 1:31 - loss: 1.2392 - regression_loss: 1.0770 - classification_loss: 0.1621 262/500 [==============>...............] - ETA: 1:31 - loss: 1.2396 - regression_loss: 1.0774 - classification_loss: 0.1622 263/500 [==============>...............] - ETA: 1:30 - loss: 1.2387 - regression_loss: 1.0766 - classification_loss: 0.1621 264/500 [==============>...............] - ETA: 1:30 - loss: 1.2363 - regression_loss: 1.0746 - classification_loss: 0.1618 265/500 [==============>...............] - ETA: 1:29 - loss: 1.2349 - regression_loss: 1.0733 - classification_loss: 0.1616 266/500 [==============>...............] - ETA: 1:29 - loss: 1.2361 - regression_loss: 1.0744 - classification_loss: 0.1617 267/500 [===============>..............] - ETA: 1:29 - loss: 1.2356 - regression_loss: 1.0739 - classification_loss: 0.1617 268/500 [===============>..............] - ETA: 1:28 - loss: 1.2331 - regression_loss: 1.0718 - classification_loss: 0.1614 269/500 [===============>..............] 
- ETA: 1:28 - loss: 1.2331 - regression_loss: 1.0718 - classification_loss: 0.1613 270/500 [===============>..............] - ETA: 1:28 - loss: 1.2331 - regression_loss: 1.0719 - classification_loss: 0.1612 271/500 [===============>..............] - ETA: 1:27 - loss: 1.2317 - regression_loss: 1.0707 - classification_loss: 0.1610 272/500 [===============>..............] - ETA: 1:27 - loss: 1.2314 - regression_loss: 1.0705 - classification_loss: 0.1609 273/500 [===============>..............] - ETA: 1:26 - loss: 1.2309 - regression_loss: 1.0700 - classification_loss: 0.1610 274/500 [===============>..............] - ETA: 1:26 - loss: 1.2314 - regression_loss: 1.0703 - classification_loss: 0.1611 275/500 [===============>..............] - ETA: 1:26 - loss: 1.2318 - regression_loss: 1.0705 - classification_loss: 0.1613 276/500 [===============>..............] - ETA: 1:25 - loss: 1.2336 - regression_loss: 1.0722 - classification_loss: 0.1614 277/500 [===============>..............] - ETA: 1:25 - loss: 1.2326 - regression_loss: 1.0712 - classification_loss: 0.1615 278/500 [===============>..............] - ETA: 1:25 - loss: 1.2315 - regression_loss: 1.0701 - classification_loss: 0.1614 279/500 [===============>..............] - ETA: 1:24 - loss: 1.2323 - regression_loss: 1.0708 - classification_loss: 0.1615 280/500 [===============>..............] - ETA: 1:24 - loss: 1.2327 - regression_loss: 1.0712 - classification_loss: 0.1614 281/500 [===============>..............] - ETA: 1:23 - loss: 1.2317 - regression_loss: 1.0704 - classification_loss: 0.1613 282/500 [===============>..............] - ETA: 1:23 - loss: 1.2309 - regression_loss: 1.0696 - classification_loss: 0.1613 283/500 [===============>..............] - ETA: 1:23 - loss: 1.2316 - regression_loss: 1.0699 - classification_loss: 0.1618 284/500 [================>.............] - ETA: 1:22 - loss: 1.2302 - regression_loss: 1.0685 - classification_loss: 0.1617 285/500 [================>.............] 
- ETA: 1:22 - loss: 1.2301 - regression_loss: 1.0684 - classification_loss: 0.1616 286/500 [================>.............] - ETA: 1:21 - loss: 1.2303 - regression_loss: 1.0686 - classification_loss: 0.1618 287/500 [================>.............] - ETA: 1:21 - loss: 1.2315 - regression_loss: 1.0695 - classification_loss: 0.1620 288/500 [================>.............] - ETA: 1:21 - loss: 1.2328 - regression_loss: 1.0708 - classification_loss: 0.1619 289/500 [================>.............] - ETA: 1:20 - loss: 1.2307 - regression_loss: 1.0691 - classification_loss: 0.1616 290/500 [================>.............] - ETA: 1:20 - loss: 1.2307 - regression_loss: 1.0690 - classification_loss: 0.1616 291/500 [================>.............] - ETA: 1:20 - loss: 1.2305 - regression_loss: 1.0690 - classification_loss: 0.1615 292/500 [================>.............] - ETA: 1:19 - loss: 1.2309 - regression_loss: 1.0693 - classification_loss: 0.1616 293/500 [================>.............] - ETA: 1:19 - loss: 1.2310 - regression_loss: 1.0693 - classification_loss: 0.1617 294/500 [================>.............] - ETA: 1:18 - loss: 1.2320 - regression_loss: 1.0702 - classification_loss: 0.1618 295/500 [================>.............] - ETA: 1:18 - loss: 1.2333 - regression_loss: 1.0713 - classification_loss: 0.1620 296/500 [================>.............] - ETA: 1:18 - loss: 1.2344 - regression_loss: 1.0726 - classification_loss: 0.1619 297/500 [================>.............] - ETA: 1:17 - loss: 1.2352 - regression_loss: 1.0731 - classification_loss: 0.1621 298/500 [================>.............] - ETA: 1:17 - loss: 1.2350 - regression_loss: 1.0730 - classification_loss: 0.1621 299/500 [================>.............] - ETA: 1:16 - loss: 1.2351 - regression_loss: 1.0730 - classification_loss: 0.1621 300/500 [=================>............] - ETA: 1:16 - loss: 1.2355 - regression_loss: 1.0734 - classification_loss: 0.1621 301/500 [=================>............] 
- ETA: 1:16 - loss: 1.2337 - regression_loss: 1.0718 - classification_loss: 0.1620 302/500 [=================>............] - ETA: 1:15 - loss: 1.2323 - regression_loss: 1.0706 - classification_loss: 0.1617 303/500 [=================>............] - ETA: 1:15 - loss: 1.2295 - regression_loss: 1.0681 - classification_loss: 0.1614 304/500 [=================>............] - ETA: 1:15 - loss: 1.2302 - regression_loss: 1.0687 - classification_loss: 0.1615 305/500 [=================>............] - ETA: 1:14 - loss: 1.2311 - regression_loss: 1.0695 - classification_loss: 0.1615 306/500 [=================>............] - ETA: 1:14 - loss: 1.2307 - regression_loss: 1.0693 - classification_loss: 0.1614 307/500 [=================>............] - ETA: 1:13 - loss: 1.2311 - regression_loss: 1.0697 - classification_loss: 0.1614 308/500 [=================>............] - ETA: 1:13 - loss: 1.2312 - regression_loss: 1.0698 - classification_loss: 0.1614 309/500 [=================>............] - ETA: 1:13 - loss: 1.2311 - regression_loss: 1.0699 - classification_loss: 0.1612 310/500 [=================>............] - ETA: 1:12 - loss: 1.2291 - regression_loss: 1.0683 - classification_loss: 0.1609 311/500 [=================>............] - ETA: 1:12 - loss: 1.2284 - regression_loss: 1.0678 - classification_loss: 0.1607 312/500 [=================>............] - ETA: 1:11 - loss: 1.2264 - regression_loss: 1.0661 - classification_loss: 0.1603 313/500 [=================>............] - ETA: 1:11 - loss: 1.2279 - regression_loss: 1.0673 - classification_loss: 0.1606 314/500 [=================>............] - ETA: 1:11 - loss: 1.2280 - regression_loss: 1.0674 - classification_loss: 0.1606 315/500 [=================>............] - ETA: 1:10 - loss: 1.2267 - regression_loss: 1.0663 - classification_loss: 0.1604 316/500 [=================>............] - ETA: 1:10 - loss: 1.2273 - regression_loss: 1.0665 - classification_loss: 0.1608 317/500 [==================>...........] 
- ETA: 1:10 - loss: 1.2288 - regression_loss: 1.0679 - classification_loss: 0.1609 318/500 [==================>...........] - ETA: 1:09 - loss: 1.2296 - regression_loss: 1.0683 - classification_loss: 0.1613 319/500 [==================>...........] - ETA: 1:09 - loss: 1.2316 - regression_loss: 1.0700 - classification_loss: 0.1616 320/500 [==================>...........] - ETA: 1:08 - loss: 1.2320 - regression_loss: 1.0705 - classification_loss: 0.1615 321/500 [==================>...........] - ETA: 1:08 - loss: 1.2313 - regression_loss: 1.0699 - classification_loss: 0.1614 322/500 [==================>...........] - ETA: 1:08 - loss: 1.2324 - regression_loss: 1.0706 - classification_loss: 0.1618 323/500 [==================>...........] - ETA: 1:07 - loss: 1.2316 - regression_loss: 1.0701 - classification_loss: 0.1615 324/500 [==================>...........] - ETA: 1:07 - loss: 1.2297 - regression_loss: 1.0683 - classification_loss: 0.1613 325/500 [==================>...........] - ETA: 1:06 - loss: 1.2305 - regression_loss: 1.0692 - classification_loss: 0.1613 326/500 [==================>...........] - ETA: 1:06 - loss: 1.2308 - regression_loss: 1.0694 - classification_loss: 0.1613 327/500 [==================>...........] - ETA: 1:06 - loss: 1.2323 - regression_loss: 1.0707 - classification_loss: 0.1616 328/500 [==================>...........] - ETA: 1:05 - loss: 1.2312 - regression_loss: 1.0698 - classification_loss: 0.1614 329/500 [==================>...........] - ETA: 1:05 - loss: 1.2313 - regression_loss: 1.0699 - classification_loss: 0.1614 330/500 [==================>...........] - ETA: 1:05 - loss: 1.2312 - regression_loss: 1.0697 - classification_loss: 0.1615 331/500 [==================>...........] - ETA: 1:04 - loss: 1.2296 - regression_loss: 1.0683 - classification_loss: 0.1613 332/500 [==================>...........] - ETA: 1:04 - loss: 1.2304 - regression_loss: 1.0689 - classification_loss: 0.1615 333/500 [==================>...........] 
- ETA: 1:03 - loss: 1.2295 - regression_loss: 1.0681 - classification_loss: 0.1613
[... per-batch progress output for steps 334-499 of epoch 7 trimmed ...]
500/500 [==============================] - 191s 383ms/step - loss: 1.2371 - regression_loss: 1.0727 - classification_loss: 0.1644
1172 instances of class plum with average precision: 0.6688
mAP: 0.6688
Epoch 00007: saving model to ./training/snapshots/resnet101_pascal_07.h5
Epoch 8/20
[... per-batch progress output for steps 1-7 of epoch 8 trimmed ...]
8/500 [..............................]
- ETA: 3:01 - loss: 1.2686 - regression_loss: 1.1176 - classification_loss: 0.1510
[... per-batch progress output for steps 9-166 of epoch 8 trimmed ...]
167/500 [=========>....................] - ETA: 2:06 - loss: 1.1469 - regression_loss: 1.0027 - classification_loss: 0.1442 168/500 [=========>....................]
- ETA: 2:06 - loss: 1.1455 - regression_loss: 1.0015 - classification_loss: 0.1441 169/500 [=========>....................] - ETA: 2:05 - loss: 1.1470 - regression_loss: 1.0026 - classification_loss: 0.1444 170/500 [=========>....................] - ETA: 2:05 - loss: 1.1477 - regression_loss: 1.0033 - classification_loss: 0.1444 171/500 [=========>....................] - ETA: 2:04 - loss: 1.1450 - regression_loss: 1.0010 - classification_loss: 0.1440 172/500 [=========>....................] - ETA: 2:04 - loss: 1.1426 - regression_loss: 0.9989 - classification_loss: 0.1437 173/500 [=========>....................] - ETA: 2:04 - loss: 1.1435 - regression_loss: 0.9998 - classification_loss: 0.1437 174/500 [=========>....................] - ETA: 2:03 - loss: 1.1416 - regression_loss: 0.9983 - classification_loss: 0.1433 175/500 [=========>....................] - ETA: 2:03 - loss: 1.1464 - regression_loss: 1.0027 - classification_loss: 0.1437 176/500 [=========>....................] - ETA: 2:02 - loss: 1.1447 - regression_loss: 1.0013 - classification_loss: 0.1434 177/500 [=========>....................] - ETA: 2:02 - loss: 1.1436 - regression_loss: 1.0004 - classification_loss: 0.1433 178/500 [=========>....................] - ETA: 2:02 - loss: 1.1466 - regression_loss: 1.0030 - classification_loss: 0.1436 179/500 [=========>....................] - ETA: 2:01 - loss: 1.1436 - regression_loss: 0.9999 - classification_loss: 0.1437 180/500 [=========>....................] - ETA: 2:01 - loss: 1.1428 - regression_loss: 0.9993 - classification_loss: 0.1435 181/500 [=========>....................] - ETA: 2:01 - loss: 1.1430 - regression_loss: 0.9995 - classification_loss: 0.1435 182/500 [=========>....................] - ETA: 2:00 - loss: 1.1434 - regression_loss: 1.0000 - classification_loss: 0.1435 183/500 [=========>....................] - ETA: 2:00 - loss: 1.1429 - regression_loss: 0.9993 - classification_loss: 0.1435 184/500 [==========>...................] 
- ETA: 1:59 - loss: 1.1428 - regression_loss: 0.9991 - classification_loss: 0.1438 185/500 [==========>...................] - ETA: 1:59 - loss: 1.1410 - regression_loss: 0.9975 - classification_loss: 0.1434 186/500 [==========>...................] - ETA: 1:59 - loss: 1.1407 - regression_loss: 0.9973 - classification_loss: 0.1434 187/500 [==========>...................] - ETA: 1:58 - loss: 1.1433 - regression_loss: 0.9993 - classification_loss: 0.1441 188/500 [==========>...................] - ETA: 1:58 - loss: 1.1443 - regression_loss: 1.0002 - classification_loss: 0.1441 189/500 [==========>...................] - ETA: 1:58 - loss: 1.1431 - regression_loss: 0.9990 - classification_loss: 0.1441 190/500 [==========>...................] - ETA: 1:57 - loss: 1.1423 - regression_loss: 0.9984 - classification_loss: 0.1439 191/500 [==========>...................] - ETA: 1:57 - loss: 1.1426 - regression_loss: 0.9988 - classification_loss: 0.1437 192/500 [==========>...................] - ETA: 1:57 - loss: 1.1448 - regression_loss: 1.0007 - classification_loss: 0.1441 193/500 [==========>...................] - ETA: 1:56 - loss: 1.1432 - regression_loss: 0.9993 - classification_loss: 0.1439 194/500 [==========>...................] - ETA: 1:56 - loss: 1.1454 - regression_loss: 1.0012 - classification_loss: 0.1442 195/500 [==========>...................] - ETA: 1:55 - loss: 1.1475 - regression_loss: 1.0030 - classification_loss: 0.1445 196/500 [==========>...................] - ETA: 1:55 - loss: 1.1491 - regression_loss: 1.0041 - classification_loss: 0.1450 197/500 [==========>...................] - ETA: 1:55 - loss: 1.1491 - regression_loss: 1.0043 - classification_loss: 0.1448 198/500 [==========>...................] - ETA: 1:54 - loss: 1.1511 - regression_loss: 1.0062 - classification_loss: 0.1449 199/500 [==========>...................] - ETA: 1:54 - loss: 1.1509 - regression_loss: 1.0061 - classification_loss: 0.1448 200/500 [===========>..................] 
- ETA: 1:54 - loss: 1.1511 - regression_loss: 1.0063 - classification_loss: 0.1448 201/500 [===========>..................] - ETA: 1:53 - loss: 1.1503 - regression_loss: 1.0059 - classification_loss: 0.1445 202/500 [===========>..................] - ETA: 1:53 - loss: 1.1512 - regression_loss: 1.0067 - classification_loss: 0.1445 203/500 [===========>..................] - ETA: 1:52 - loss: 1.1536 - regression_loss: 1.0089 - classification_loss: 0.1447 204/500 [===========>..................] - ETA: 1:52 - loss: 1.1525 - regression_loss: 1.0078 - classification_loss: 0.1447 205/500 [===========>..................] - ETA: 1:52 - loss: 1.1512 - regression_loss: 1.0066 - classification_loss: 0.1446 206/500 [===========>..................] - ETA: 1:51 - loss: 1.1522 - regression_loss: 1.0072 - classification_loss: 0.1450 207/500 [===========>..................] - ETA: 1:51 - loss: 1.1525 - regression_loss: 1.0075 - classification_loss: 0.1450 208/500 [===========>..................] - ETA: 1:50 - loss: 1.1523 - regression_loss: 1.0071 - classification_loss: 0.1453 209/500 [===========>..................] - ETA: 1:50 - loss: 1.1536 - regression_loss: 1.0082 - classification_loss: 0.1454 210/500 [===========>..................] - ETA: 1:50 - loss: 1.1558 - regression_loss: 1.0099 - classification_loss: 0.1458 211/500 [===========>..................] - ETA: 1:49 - loss: 1.1576 - regression_loss: 1.0115 - classification_loss: 0.1461 212/500 [===========>..................] - ETA: 1:49 - loss: 1.1584 - regression_loss: 1.0124 - classification_loss: 0.1460 213/500 [===========>..................] - ETA: 1:49 - loss: 1.1559 - regression_loss: 1.0104 - classification_loss: 0.1456 214/500 [===========>..................] - ETA: 1:48 - loss: 1.1546 - regression_loss: 1.0091 - classification_loss: 0.1455 215/500 [===========>..................] - ETA: 1:48 - loss: 1.1540 - regression_loss: 1.0083 - classification_loss: 0.1457 216/500 [===========>..................] 
- ETA: 1:47 - loss: 1.1553 - regression_loss: 1.0095 - classification_loss: 0.1458 217/500 [============>.................] - ETA: 1:47 - loss: 1.1540 - regression_loss: 1.0085 - classification_loss: 0.1455 218/500 [============>.................] - ETA: 1:47 - loss: 1.1515 - regression_loss: 1.0064 - classification_loss: 0.1451 219/500 [============>.................] - ETA: 1:46 - loss: 1.1504 - regression_loss: 1.0055 - classification_loss: 0.1449 220/500 [============>.................] - ETA: 1:46 - loss: 1.1494 - regression_loss: 1.0046 - classification_loss: 0.1448 221/500 [============>.................] - ETA: 1:46 - loss: 1.1481 - regression_loss: 1.0035 - classification_loss: 0.1447 222/500 [============>.................] - ETA: 1:45 - loss: 1.1482 - regression_loss: 1.0036 - classification_loss: 0.1446 223/500 [============>.................] - ETA: 1:45 - loss: 1.1485 - regression_loss: 1.0038 - classification_loss: 0.1448 224/500 [============>.................] - ETA: 1:44 - loss: 1.1475 - regression_loss: 1.0026 - classification_loss: 0.1449 225/500 [============>.................] - ETA: 1:44 - loss: 1.1463 - regression_loss: 1.0015 - classification_loss: 0.1448 226/500 [============>.................] - ETA: 1:44 - loss: 1.1466 - regression_loss: 1.0018 - classification_loss: 0.1448 227/500 [============>.................] - ETA: 1:43 - loss: 1.1457 - regression_loss: 1.0011 - classification_loss: 0.1446 228/500 [============>.................] - ETA: 1:43 - loss: 1.1446 - regression_loss: 1.0001 - classification_loss: 0.1445 229/500 [============>.................] - ETA: 1:43 - loss: 1.1438 - regression_loss: 0.9994 - classification_loss: 0.1443 230/500 [============>.................] - ETA: 1:42 - loss: 1.1433 - regression_loss: 0.9990 - classification_loss: 0.1443 231/500 [============>.................] - ETA: 1:42 - loss: 1.1442 - regression_loss: 0.9997 - classification_loss: 0.1445 232/500 [============>.................] 
- ETA: 1:41 - loss: 1.1418 - regression_loss: 0.9976 - classification_loss: 0.1441 233/500 [============>.................] - ETA: 1:41 - loss: 1.1425 - regression_loss: 0.9984 - classification_loss: 0.1441 234/500 [=============>................] - ETA: 1:41 - loss: 1.1423 - regression_loss: 0.9983 - classification_loss: 0.1440 235/500 [=============>................] - ETA: 1:40 - loss: 1.1402 - regression_loss: 0.9964 - classification_loss: 0.1438 236/500 [=============>................] - ETA: 1:40 - loss: 1.1388 - regression_loss: 0.9947 - classification_loss: 0.1441 237/500 [=============>................] - ETA: 1:40 - loss: 1.1373 - regression_loss: 0.9934 - classification_loss: 0.1439 238/500 [=============>................] - ETA: 1:39 - loss: 1.1370 - regression_loss: 0.9931 - classification_loss: 0.1439 239/500 [=============>................] - ETA: 1:39 - loss: 1.1379 - regression_loss: 0.9939 - classification_loss: 0.1440 240/500 [=============>................] - ETA: 1:38 - loss: 1.1358 - regression_loss: 0.9921 - classification_loss: 0.1437 241/500 [=============>................] - ETA: 1:38 - loss: 1.1362 - regression_loss: 0.9924 - classification_loss: 0.1438 242/500 [=============>................] - ETA: 1:38 - loss: 1.1338 - regression_loss: 0.9903 - classification_loss: 0.1435 243/500 [=============>................] - ETA: 1:37 - loss: 1.1315 - regression_loss: 0.9884 - classification_loss: 0.1431 244/500 [=============>................] - ETA: 1:37 - loss: 1.1307 - regression_loss: 0.9876 - classification_loss: 0.1431 245/500 [=============>................] - ETA: 1:37 - loss: 1.1325 - regression_loss: 0.9890 - classification_loss: 0.1434 246/500 [=============>................] - ETA: 1:36 - loss: 1.1337 - regression_loss: 0.9900 - classification_loss: 0.1437 247/500 [=============>................] - ETA: 1:36 - loss: 1.1345 - regression_loss: 0.9907 - classification_loss: 0.1438 248/500 [=============>................] 
- ETA: 1:35 - loss: 1.1338 - regression_loss: 0.9901 - classification_loss: 0.1438 249/500 [=============>................] - ETA: 1:35 - loss: 1.1317 - regression_loss: 0.9882 - classification_loss: 0.1435 250/500 [==============>...............] - ETA: 1:35 - loss: 1.1307 - regression_loss: 0.9871 - classification_loss: 0.1436 251/500 [==============>...............] - ETA: 1:34 - loss: 1.1293 - regression_loss: 0.9857 - classification_loss: 0.1436 252/500 [==============>...............] - ETA: 1:34 - loss: 1.1297 - regression_loss: 0.9861 - classification_loss: 0.1436 253/500 [==============>...............] - ETA: 1:34 - loss: 1.1326 - regression_loss: 0.9889 - classification_loss: 0.1436 254/500 [==============>...............] - ETA: 1:33 - loss: 1.1334 - regression_loss: 0.9896 - classification_loss: 0.1438 255/500 [==============>...............] - ETA: 1:33 - loss: 1.1338 - regression_loss: 0.9903 - classification_loss: 0.1436 256/500 [==============>...............] - ETA: 1:33 - loss: 1.1358 - regression_loss: 0.9919 - classification_loss: 0.1438 257/500 [==============>...............] - ETA: 1:32 - loss: 1.1365 - regression_loss: 0.9925 - classification_loss: 0.1440 258/500 [==============>...............] - ETA: 1:32 - loss: 1.1370 - regression_loss: 0.9928 - classification_loss: 0.1442 259/500 [==============>...............] - ETA: 1:31 - loss: 1.1349 - regression_loss: 0.9909 - classification_loss: 0.1440 260/500 [==============>...............] - ETA: 1:31 - loss: 1.1374 - regression_loss: 0.9932 - classification_loss: 0.1442 261/500 [==============>...............] - ETA: 1:31 - loss: 1.1362 - regression_loss: 0.9923 - classification_loss: 0.1439 262/500 [==============>...............] - ETA: 1:30 - loss: 1.1377 - regression_loss: 0.9935 - classification_loss: 0.1442 263/500 [==============>...............] - ETA: 1:30 - loss: 1.1376 - regression_loss: 0.9934 - classification_loss: 0.1442 264/500 [==============>...............] 
- ETA: 1:29 - loss: 1.1387 - regression_loss: 0.9944 - classification_loss: 0.1443 265/500 [==============>...............] - ETA: 1:29 - loss: 1.1365 - regression_loss: 0.9924 - classification_loss: 0.1441 266/500 [==============>...............] - ETA: 1:29 - loss: 1.1360 - regression_loss: 0.9920 - classification_loss: 0.1440 267/500 [===============>..............] - ETA: 1:28 - loss: 1.1378 - regression_loss: 0.9937 - classification_loss: 0.1441 268/500 [===============>..............] - ETA: 1:28 - loss: 1.1386 - regression_loss: 0.9946 - classification_loss: 0.1441 269/500 [===============>..............] - ETA: 1:28 - loss: 1.1392 - regression_loss: 0.9950 - classification_loss: 0.1442 270/500 [===============>..............] - ETA: 1:27 - loss: 1.1409 - regression_loss: 0.9965 - classification_loss: 0.1444 271/500 [===============>..............] - ETA: 1:27 - loss: 1.1425 - regression_loss: 0.9975 - classification_loss: 0.1449 272/500 [===============>..............] - ETA: 1:26 - loss: 1.1434 - regression_loss: 0.9984 - classification_loss: 0.1449 273/500 [===============>..............] - ETA: 1:26 - loss: 1.1428 - regression_loss: 0.9979 - classification_loss: 0.1449 274/500 [===============>..............] - ETA: 1:26 - loss: 1.1422 - regression_loss: 0.9975 - classification_loss: 0.1447 275/500 [===============>..............] - ETA: 1:25 - loss: 1.1394 - regression_loss: 0.9950 - classification_loss: 0.1444 276/500 [===============>..............] - ETA: 1:25 - loss: 1.1391 - regression_loss: 0.9947 - classification_loss: 0.1444 277/500 [===============>..............] - ETA: 1:24 - loss: 1.1385 - regression_loss: 0.9943 - classification_loss: 0.1442 278/500 [===============>..............] - ETA: 1:24 - loss: 1.1391 - regression_loss: 0.9947 - classification_loss: 0.1444 279/500 [===============>..............] - ETA: 1:24 - loss: 1.1412 - regression_loss: 0.9966 - classification_loss: 0.1446 280/500 [===============>..............] 
- ETA: 1:23 - loss: 1.1426 - regression_loss: 0.9979 - classification_loss: 0.1447 281/500 [===============>..............] - ETA: 1:23 - loss: 1.1435 - regression_loss: 0.9987 - classification_loss: 0.1448 282/500 [===============>..............] - ETA: 1:23 - loss: 1.1434 - regression_loss: 0.9985 - classification_loss: 0.1449 283/500 [===============>..............] - ETA: 1:22 - loss: 1.1431 - regression_loss: 0.9982 - classification_loss: 0.1449 284/500 [================>.............] - ETA: 1:22 - loss: 1.1420 - regression_loss: 0.9973 - classification_loss: 0.1447 285/500 [================>.............] - ETA: 1:21 - loss: 1.1406 - regression_loss: 0.9961 - classification_loss: 0.1445 286/500 [================>.............] - ETA: 1:21 - loss: 1.1430 - regression_loss: 0.9982 - classification_loss: 0.1449 287/500 [================>.............] - ETA: 1:21 - loss: 1.1425 - regression_loss: 0.9977 - classification_loss: 0.1448 288/500 [================>.............] - ETA: 1:20 - loss: 1.1432 - regression_loss: 0.9983 - classification_loss: 0.1449 289/500 [================>.............] - ETA: 1:20 - loss: 1.1425 - regression_loss: 0.9976 - classification_loss: 0.1448 290/500 [================>.............] - ETA: 1:19 - loss: 1.1412 - regression_loss: 0.9966 - classification_loss: 0.1447 291/500 [================>.............] - ETA: 1:19 - loss: 1.1409 - regression_loss: 0.9963 - classification_loss: 0.1446 292/500 [================>.............] - ETA: 1:19 - loss: 1.1419 - regression_loss: 0.9973 - classification_loss: 0.1446 293/500 [================>.............] - ETA: 1:18 - loss: 1.1402 - regression_loss: 0.9959 - classification_loss: 0.1443 294/500 [================>.............] - ETA: 1:18 - loss: 1.1386 - regression_loss: 0.9946 - classification_loss: 0.1441 295/500 [================>.............] - ETA: 1:18 - loss: 1.1370 - regression_loss: 0.9932 - classification_loss: 0.1439 296/500 [================>.............] 
- ETA: 1:17 - loss: 1.1367 - regression_loss: 0.9931 - classification_loss: 0.1436 297/500 [================>.............] - ETA: 1:17 - loss: 1.1373 - regression_loss: 0.9937 - classification_loss: 0.1436 298/500 [================>.............] - ETA: 1:16 - loss: 1.1380 - regression_loss: 0.9944 - classification_loss: 0.1437 299/500 [================>.............] - ETA: 1:16 - loss: 1.1380 - regression_loss: 0.9944 - classification_loss: 0.1436 300/500 [=================>............] - ETA: 1:16 - loss: 1.1389 - regression_loss: 0.9951 - classification_loss: 0.1438 301/500 [=================>............] - ETA: 1:15 - loss: 1.1403 - regression_loss: 0.9963 - classification_loss: 0.1440 302/500 [=================>............] - ETA: 1:15 - loss: 1.1406 - regression_loss: 0.9965 - classification_loss: 0.1441 303/500 [=================>............] - ETA: 1:15 - loss: 1.1408 - regression_loss: 0.9967 - classification_loss: 0.1441 304/500 [=================>............] - ETA: 1:14 - loss: 1.1430 - regression_loss: 0.9990 - classification_loss: 0.1439 305/500 [=================>............] - ETA: 1:14 - loss: 1.1442 - regression_loss: 1.0003 - classification_loss: 0.1440 306/500 [=================>............] - ETA: 1:13 - loss: 1.1443 - regression_loss: 1.0002 - classification_loss: 0.1440 307/500 [=================>............] - ETA: 1:13 - loss: 1.1452 - regression_loss: 1.0012 - classification_loss: 0.1440 308/500 [=================>............] - ETA: 1:13 - loss: 1.1434 - regression_loss: 0.9997 - classification_loss: 0.1437 309/500 [=================>............] - ETA: 1:12 - loss: 1.1422 - regression_loss: 0.9987 - classification_loss: 0.1434 310/500 [=================>............] - ETA: 1:12 - loss: 1.1410 - regression_loss: 0.9977 - classification_loss: 0.1433 311/500 [=================>............] - ETA: 1:11 - loss: 1.1427 - regression_loss: 0.9992 - classification_loss: 0.1435 312/500 [=================>............] 
- ETA: 1:11 - loss: 1.1405 - regression_loss: 0.9973 - classification_loss: 0.1432 313/500 [=================>............] - ETA: 1:11 - loss: 1.1407 - regression_loss: 0.9975 - classification_loss: 0.1432 314/500 [=================>............] - ETA: 1:10 - loss: 1.1429 - regression_loss: 0.9995 - classification_loss: 0.1434 315/500 [=================>............] - ETA: 1:10 - loss: 1.1420 - regression_loss: 0.9987 - classification_loss: 0.1433 316/500 [=================>............] - ETA: 1:10 - loss: 1.1401 - regression_loss: 0.9970 - classification_loss: 0.1431 317/500 [==================>...........] - ETA: 1:09 - loss: 1.1389 - regression_loss: 0.9960 - classification_loss: 0.1429 318/500 [==================>...........] - ETA: 1:09 - loss: 1.1394 - regression_loss: 0.9963 - classification_loss: 0.1431 319/500 [==================>...........] - ETA: 1:08 - loss: 1.1378 - regression_loss: 0.9950 - classification_loss: 0.1428 320/500 [==================>...........] - ETA: 1:08 - loss: 1.1389 - regression_loss: 0.9959 - classification_loss: 0.1430 321/500 [==================>...........] - ETA: 1:08 - loss: 1.1368 - regression_loss: 0.9940 - classification_loss: 0.1428 322/500 [==================>...........] - ETA: 1:07 - loss: 1.1367 - regression_loss: 0.9939 - classification_loss: 0.1428 323/500 [==================>...........] - ETA: 1:07 - loss: 1.1385 - regression_loss: 0.9956 - classification_loss: 0.1429 324/500 [==================>...........] - ETA: 1:06 - loss: 1.1388 - regression_loss: 0.9958 - classification_loss: 0.1430 325/500 [==================>...........] - ETA: 1:06 - loss: 1.1389 - regression_loss: 0.9959 - classification_loss: 0.1430 326/500 [==================>...........] - ETA: 1:06 - loss: 1.1384 - regression_loss: 0.9954 - classification_loss: 0.1430 327/500 [==================>...........] - ETA: 1:05 - loss: 1.1387 - regression_loss: 0.9957 - classification_loss: 0.1430 328/500 [==================>...........] 
- ETA: 1:05 - loss: 1.1389 - regression_loss: 0.9959 - classification_loss: 0.1430 329/500 [==================>...........] - ETA: 1:05 - loss: 1.1391 - regression_loss: 0.9960 - classification_loss: 0.1431 330/500 [==================>...........] - ETA: 1:04 - loss: 1.1395 - regression_loss: 0.9964 - classification_loss: 0.1432 331/500 [==================>...........] - ETA: 1:04 - loss: 1.1395 - regression_loss: 0.9963 - classification_loss: 0.1432 332/500 [==================>...........] - ETA: 1:03 - loss: 1.1394 - regression_loss: 0.9963 - classification_loss: 0.1431 333/500 [==================>...........] - ETA: 1:03 - loss: 1.1398 - regression_loss: 0.9966 - classification_loss: 0.1432 334/500 [===================>..........] - ETA: 1:03 - loss: 1.1410 - regression_loss: 0.9975 - classification_loss: 0.1435 335/500 [===================>..........] - ETA: 1:02 - loss: 1.1398 - regression_loss: 0.9965 - classification_loss: 0.1433 336/500 [===================>..........] - ETA: 1:02 - loss: 1.1386 - regression_loss: 0.9953 - classification_loss: 0.1433 337/500 [===================>..........] - ETA: 1:01 - loss: 1.1381 - regression_loss: 0.9950 - classification_loss: 0.1432 338/500 [===================>..........] - ETA: 1:01 - loss: 1.1392 - regression_loss: 0.9958 - classification_loss: 0.1434 339/500 [===================>..........] - ETA: 1:01 - loss: 1.1383 - regression_loss: 0.9951 - classification_loss: 0.1433 340/500 [===================>..........] - ETA: 1:00 - loss: 1.1380 - regression_loss: 0.9948 - classification_loss: 0.1432 341/500 [===================>..........] - ETA: 1:00 - loss: 1.1378 - regression_loss: 0.9947 - classification_loss: 0.1431 342/500 [===================>..........] - ETA: 1:00 - loss: 1.1386 - regression_loss: 0.9954 - classification_loss: 0.1432 343/500 [===================>..........] - ETA: 59s - loss: 1.1389 - regression_loss: 0.9956 - classification_loss: 0.1433  344/500 [===================>..........] 
- ETA: 59s - loss: 1.1382 - regression_loss: 0.9950 - classification_loss: 0.1433 345/500 [===================>..........] - ETA: 58s - loss: 1.1375 - regression_loss: 0.9944 - classification_loss: 0.1431 346/500 [===================>..........] - ETA: 58s - loss: 1.1376 - regression_loss: 0.9945 - classification_loss: 0.1431 347/500 [===================>..........] - ETA: 58s - loss: 1.1385 - regression_loss: 0.9953 - classification_loss: 0.1432 348/500 [===================>..........] - ETA: 57s - loss: 1.1385 - regression_loss: 0.9952 - classification_loss: 0.1434 349/500 [===================>..........] - ETA: 57s - loss: 1.1361 - regression_loss: 0.9930 - classification_loss: 0.1431 350/500 [====================>.........] - ETA: 57s - loss: 1.1344 - regression_loss: 0.9914 - classification_loss: 0.1430 351/500 [====================>.........] - ETA: 56s - loss: 1.1336 - regression_loss: 0.9907 - classification_loss: 0.1429 352/500 [====================>.........] - ETA: 56s - loss: 1.1342 - regression_loss: 0.9912 - classification_loss: 0.1430 353/500 [====================>.........] - ETA: 55s - loss: 1.1342 - regression_loss: 0.9912 - classification_loss: 0.1430 354/500 [====================>.........] - ETA: 55s - loss: 1.1324 - regression_loss: 0.9896 - classification_loss: 0.1428 355/500 [====================>.........] - ETA: 55s - loss: 1.1307 - regression_loss: 0.9880 - classification_loss: 0.1426 356/500 [====================>.........] - ETA: 54s - loss: 1.1308 - regression_loss: 0.9881 - classification_loss: 0.1426 357/500 [====================>.........] - ETA: 54s - loss: 1.1292 - regression_loss: 0.9868 - classification_loss: 0.1424 358/500 [====================>.........] - ETA: 54s - loss: 1.1296 - regression_loss: 0.9872 - classification_loss: 0.1424 359/500 [====================>.........] - ETA: 53s - loss: 1.1305 - regression_loss: 0.9878 - classification_loss: 0.1427 360/500 [====================>.........] 
- ETA: 53s - loss: 1.1316 - regression_loss: 0.9888 - classification_loss: 0.1428 361/500 [====================>.........] - ETA: 52s - loss: 1.1327 - regression_loss: 0.9897 - classification_loss: 0.1429 362/500 [====================>.........] - ETA: 52s - loss: 1.1321 - regression_loss: 0.9892 - classification_loss: 0.1429 363/500 [====================>.........] - ETA: 52s - loss: 1.1317 - regression_loss: 0.9888 - classification_loss: 0.1429 364/500 [====================>.........] - ETA: 51s - loss: 1.1331 - regression_loss: 0.9899 - classification_loss: 0.1432 365/500 [====================>.........] - ETA: 51s - loss: 1.1339 - regression_loss: 0.9907 - classification_loss: 0.1432 366/500 [====================>.........] - ETA: 50s - loss: 1.1330 - regression_loss: 0.9900 - classification_loss: 0.1430 367/500 [=====================>........] - ETA: 50s - loss: 1.1344 - regression_loss: 0.9908 - classification_loss: 0.1435 368/500 [=====================>........] - ETA: 50s - loss: 1.1332 - regression_loss: 0.9898 - classification_loss: 0.1434 369/500 [=====================>........] - ETA: 49s - loss: 1.1331 - regression_loss: 0.9898 - classification_loss: 0.1433 370/500 [=====================>........] - ETA: 49s - loss: 1.1327 - regression_loss: 0.9894 - classification_loss: 0.1433 371/500 [=====================>........] - ETA: 49s - loss: 1.1330 - regression_loss: 0.9897 - classification_loss: 0.1433 372/500 [=====================>........] - ETA: 48s - loss: 1.1327 - regression_loss: 0.9894 - classification_loss: 0.1433 373/500 [=====================>........] - ETA: 48s - loss: 1.1314 - regression_loss: 0.9882 - classification_loss: 0.1432 374/500 [=====================>........] - ETA: 47s - loss: 1.1311 - regression_loss: 0.9880 - classification_loss: 0.1430 375/500 [=====================>........] - ETA: 47s - loss: 1.1301 - regression_loss: 0.9870 - classification_loss: 0.1431 376/500 [=====================>........] 
[per-batch progress updates for steps 377-499 of epoch 8 omitted; running loss fluctuated between ~1.12 and ~1.13]
500/500 [==============================] - 190s 381ms/step - loss: 1.1271 - regression_loss: 0.9853 - classification_loss: 0.1418
1172 instances of class plum with average precision: 0.6675
mAP: 0.6675
Epoch 00008: saving model to ./training/snapshots/resnet101_pascal_08.h5
Epoch 9/20
[per-batch progress updates for steps 1-211 of epoch 9 omitted; running loss fell from ~1.41 at step 1 to ~1.07 by step 211]
- ETA: 1:50 - loss: 1.0729 - regression_loss: 0.9439 - classification_loss: 0.1290 212/500 [===========>..................] - ETA: 1:50 - loss: 1.0727 - regression_loss: 0.9438 - classification_loss: 0.1289 213/500 [===========>..................] - ETA: 1:50 - loss: 1.0703 - regression_loss: 0.9417 - classification_loss: 0.1286 214/500 [===========>..................] - ETA: 1:49 - loss: 1.0707 - regression_loss: 0.9423 - classification_loss: 0.1284 215/500 [===========>..................] - ETA: 1:49 - loss: 1.0708 - regression_loss: 0.9426 - classification_loss: 0.1283 216/500 [===========>..................] - ETA: 1:49 - loss: 1.0689 - regression_loss: 0.9409 - classification_loss: 0.1280 217/500 [============>.................] - ETA: 1:48 - loss: 1.0700 - regression_loss: 0.9418 - classification_loss: 0.1282 218/500 [============>.................] - ETA: 1:48 - loss: 1.0712 - regression_loss: 0.9428 - classification_loss: 0.1284 219/500 [============>.................] - ETA: 1:48 - loss: 1.0698 - regression_loss: 0.9414 - classification_loss: 0.1283 220/500 [============>.................] - ETA: 1:47 - loss: 1.0705 - regression_loss: 0.9420 - classification_loss: 0.1285 221/500 [============>.................] - ETA: 1:47 - loss: 1.0714 - regression_loss: 0.9428 - classification_loss: 0.1287 222/500 [============>.................] - ETA: 1:46 - loss: 1.0726 - regression_loss: 0.9439 - classification_loss: 0.1287 223/500 [============>.................] - ETA: 1:46 - loss: 1.0717 - regression_loss: 0.9433 - classification_loss: 0.1285 224/500 [============>.................] - ETA: 1:46 - loss: 1.0744 - regression_loss: 0.9454 - classification_loss: 0.1290 225/500 [============>.................] - ETA: 1:45 - loss: 1.0739 - regression_loss: 0.9447 - classification_loss: 0.1292 226/500 [============>.................] - ETA: 1:45 - loss: 1.0739 - regression_loss: 0.9447 - classification_loss: 0.1292 227/500 [============>.................] 
- ETA: 1:44 - loss: 1.0732 - regression_loss: 0.9441 - classification_loss: 0.1291 228/500 [============>.................] - ETA: 1:44 - loss: 1.0717 - regression_loss: 0.9427 - classification_loss: 0.1290 229/500 [============>.................] - ETA: 1:44 - loss: 1.0704 - regression_loss: 0.9414 - classification_loss: 0.1290 230/500 [============>.................] - ETA: 1:43 - loss: 1.0722 - regression_loss: 0.9429 - classification_loss: 0.1293 231/500 [============>.................] - ETA: 1:43 - loss: 1.0716 - regression_loss: 0.9423 - classification_loss: 0.1292 232/500 [============>.................] - ETA: 1:42 - loss: 1.0742 - regression_loss: 0.9446 - classification_loss: 0.1296 233/500 [============>.................] - ETA: 1:42 - loss: 1.0716 - regression_loss: 0.9423 - classification_loss: 0.1293 234/500 [=============>................] - ETA: 1:42 - loss: 1.0719 - regression_loss: 0.9426 - classification_loss: 0.1293 235/500 [=============>................] - ETA: 1:41 - loss: 1.0725 - regression_loss: 0.9431 - classification_loss: 0.1294 236/500 [=============>................] - ETA: 1:41 - loss: 1.0705 - regression_loss: 0.9411 - classification_loss: 0.1294 237/500 [=============>................] - ETA: 1:41 - loss: 1.0692 - regression_loss: 0.9399 - classification_loss: 0.1293 238/500 [=============>................] - ETA: 1:40 - loss: 1.0676 - regression_loss: 0.9386 - classification_loss: 0.1290 239/500 [=============>................] - ETA: 1:40 - loss: 1.0686 - regression_loss: 0.9394 - classification_loss: 0.1292 240/500 [=============>................] - ETA: 1:39 - loss: 1.0684 - regression_loss: 0.9393 - classification_loss: 0.1291 241/500 [=============>................] - ETA: 1:39 - loss: 1.0658 - regression_loss: 0.9370 - classification_loss: 0.1288 242/500 [=============>................] - ETA: 1:39 - loss: 1.0665 - regression_loss: 0.9377 - classification_loss: 0.1289 243/500 [=============>................] 
- ETA: 1:38 - loss: 1.0681 - regression_loss: 0.9389 - classification_loss: 0.1292 244/500 [=============>................] - ETA: 1:38 - loss: 1.0682 - regression_loss: 0.9389 - classification_loss: 0.1293 245/500 [=============>................] - ETA: 1:38 - loss: 1.0681 - regression_loss: 0.9387 - classification_loss: 0.1294 246/500 [=============>................] - ETA: 1:37 - loss: 1.0672 - regression_loss: 0.9379 - classification_loss: 0.1293 247/500 [=============>................] - ETA: 1:37 - loss: 1.0676 - regression_loss: 0.9382 - classification_loss: 0.1294 248/500 [=============>................] - ETA: 1:36 - loss: 1.0668 - regression_loss: 0.9374 - classification_loss: 0.1294 249/500 [=============>................] - ETA: 1:36 - loss: 1.0661 - regression_loss: 0.9367 - classification_loss: 0.1294 250/500 [==============>...............] - ETA: 1:36 - loss: 1.0679 - regression_loss: 0.9382 - classification_loss: 0.1297 251/500 [==============>...............] - ETA: 1:35 - loss: 1.0656 - regression_loss: 0.9362 - classification_loss: 0.1294 252/500 [==============>...............] - ETA: 1:35 - loss: 1.0650 - regression_loss: 0.9357 - classification_loss: 0.1293 253/500 [==============>...............] - ETA: 1:35 - loss: 1.0651 - regression_loss: 0.9356 - classification_loss: 0.1294 254/500 [==============>...............] - ETA: 1:34 - loss: 1.0643 - regression_loss: 0.9348 - classification_loss: 0.1295 255/500 [==============>...............] - ETA: 1:34 - loss: 1.0630 - regression_loss: 0.9338 - classification_loss: 0.1293 256/500 [==============>...............] - ETA: 1:33 - loss: 1.0630 - regression_loss: 0.9338 - classification_loss: 0.1292 257/500 [==============>...............] - ETA: 1:33 - loss: 1.0631 - regression_loss: 0.9338 - classification_loss: 0.1293 258/500 [==============>...............] - ETA: 1:33 - loss: 1.0635 - regression_loss: 0.9341 - classification_loss: 0.1294 259/500 [==============>...............] 
- ETA: 1:32 - loss: 1.0632 - regression_loss: 0.9337 - classification_loss: 0.1294 260/500 [==============>...............] - ETA: 1:32 - loss: 1.0633 - regression_loss: 0.9339 - classification_loss: 0.1294 261/500 [==============>...............] - ETA: 1:32 - loss: 1.0639 - regression_loss: 0.9346 - classification_loss: 0.1293 262/500 [==============>...............] - ETA: 1:31 - loss: 1.0649 - regression_loss: 0.9354 - classification_loss: 0.1294 263/500 [==============>...............] - ETA: 1:31 - loss: 1.0651 - regression_loss: 0.9355 - classification_loss: 0.1295 264/500 [==============>...............] - ETA: 1:30 - loss: 1.0639 - regression_loss: 0.9344 - classification_loss: 0.1295 265/500 [==============>...............] - ETA: 1:30 - loss: 1.0623 - regression_loss: 0.9330 - classification_loss: 0.1293 266/500 [==============>...............] - ETA: 1:30 - loss: 1.0630 - regression_loss: 0.9337 - classification_loss: 0.1293 267/500 [===============>..............] - ETA: 1:29 - loss: 1.0620 - regression_loss: 0.9328 - classification_loss: 0.1291 268/500 [===============>..............] - ETA: 1:29 - loss: 1.0620 - regression_loss: 0.9329 - classification_loss: 0.1291 269/500 [===============>..............] - ETA: 1:28 - loss: 1.0618 - regression_loss: 0.9328 - classification_loss: 0.1291 270/500 [===============>..............] - ETA: 1:28 - loss: 1.0616 - regression_loss: 0.9325 - classification_loss: 0.1290 271/500 [===============>..............] - ETA: 1:28 - loss: 1.0609 - regression_loss: 0.9321 - classification_loss: 0.1288 272/500 [===============>..............] - ETA: 1:27 - loss: 1.0627 - regression_loss: 0.9336 - classification_loss: 0.1291 273/500 [===============>..............] - ETA: 1:27 - loss: 1.0648 - regression_loss: 0.9353 - classification_loss: 0.1295 274/500 [===============>..............] - ETA: 1:27 - loss: 1.0641 - regression_loss: 0.9345 - classification_loss: 0.1296 275/500 [===============>..............] 
- ETA: 1:26 - loss: 1.0620 - regression_loss: 0.9328 - classification_loss: 0.1293 276/500 [===============>..............] - ETA: 1:26 - loss: 1.0620 - regression_loss: 0.9326 - classification_loss: 0.1294 277/500 [===============>..............] - ETA: 1:25 - loss: 1.0629 - regression_loss: 0.9334 - classification_loss: 0.1295 278/500 [===============>..............] - ETA: 1:25 - loss: 1.0644 - regression_loss: 0.9346 - classification_loss: 0.1298 279/500 [===============>..............] - ETA: 1:25 - loss: 1.0631 - regression_loss: 0.9336 - classification_loss: 0.1295 280/500 [===============>..............] - ETA: 1:24 - loss: 1.0630 - regression_loss: 0.9335 - classification_loss: 0.1295 281/500 [===============>..............] - ETA: 1:24 - loss: 1.0615 - regression_loss: 0.9323 - classification_loss: 0.1292 282/500 [===============>..............] - ETA: 1:24 - loss: 1.0609 - regression_loss: 0.9317 - classification_loss: 0.1292 283/500 [===============>..............] - ETA: 1:23 - loss: 1.0594 - regression_loss: 0.9304 - classification_loss: 0.1290 284/500 [================>.............] - ETA: 1:23 - loss: 1.0598 - regression_loss: 0.9308 - classification_loss: 0.1291 285/500 [================>.............] - ETA: 1:22 - loss: 1.0594 - regression_loss: 0.9303 - classification_loss: 0.1290 286/500 [================>.............] - ETA: 1:22 - loss: 1.0594 - regression_loss: 0.9304 - classification_loss: 0.1290 287/500 [================>.............] - ETA: 1:22 - loss: 1.0594 - regression_loss: 0.9302 - classification_loss: 0.1292 288/500 [================>.............] - ETA: 1:21 - loss: 1.0579 - regression_loss: 0.9289 - classification_loss: 0.1290 289/500 [================>.............] - ETA: 1:21 - loss: 1.0599 - regression_loss: 0.9306 - classification_loss: 0.1293 290/500 [================>.............] - ETA: 1:21 - loss: 1.0591 - regression_loss: 0.9298 - classification_loss: 0.1293 291/500 [================>.............] 
- ETA: 1:20 - loss: 1.0595 - regression_loss: 0.9301 - classification_loss: 0.1294 292/500 [================>.............] - ETA: 1:20 - loss: 1.0589 - regression_loss: 0.9296 - classification_loss: 0.1293 293/500 [================>.............] - ETA: 1:19 - loss: 1.0571 - regression_loss: 0.9281 - classification_loss: 0.1290 294/500 [================>.............] - ETA: 1:19 - loss: 1.0577 - regression_loss: 0.9288 - classification_loss: 0.1289 295/500 [================>.............] - ETA: 1:19 - loss: 1.0573 - regression_loss: 0.9284 - classification_loss: 0.1289 296/500 [================>.............] - ETA: 1:18 - loss: 1.0577 - regression_loss: 0.9289 - classification_loss: 0.1289 297/500 [================>.............] - ETA: 1:18 - loss: 1.0566 - regression_loss: 0.9278 - classification_loss: 0.1288 298/500 [================>.............] - ETA: 1:18 - loss: 1.0573 - regression_loss: 0.9284 - classification_loss: 0.1289 299/500 [================>.............] - ETA: 1:17 - loss: 1.0580 - regression_loss: 0.9289 - classification_loss: 0.1291 300/500 [=================>............] - ETA: 1:17 - loss: 1.0573 - regression_loss: 0.9284 - classification_loss: 0.1289 301/500 [=================>............] - ETA: 1:16 - loss: 1.0575 - regression_loss: 0.9287 - classification_loss: 0.1289 302/500 [=================>............] - ETA: 1:16 - loss: 1.0563 - regression_loss: 0.9277 - classification_loss: 0.1286 303/500 [=================>............] - ETA: 1:16 - loss: 1.0554 - regression_loss: 0.9270 - classification_loss: 0.1285 304/500 [=================>............] - ETA: 1:15 - loss: 1.0539 - regression_loss: 0.9256 - classification_loss: 0.1284 305/500 [=================>............] - ETA: 1:15 - loss: 1.0525 - regression_loss: 0.9243 - classification_loss: 0.1283 306/500 [=================>............] - ETA: 1:14 - loss: 1.0511 - regression_loss: 0.9231 - classification_loss: 0.1280 307/500 [=================>............] 
- ETA: 1:14 - loss: 1.0533 - regression_loss: 0.9249 - classification_loss: 0.1284 308/500 [=================>............] - ETA: 1:14 - loss: 1.0534 - regression_loss: 0.9249 - classification_loss: 0.1284 309/500 [=================>............] - ETA: 1:13 - loss: 1.0545 - regression_loss: 0.9259 - classification_loss: 0.1286 310/500 [=================>............] - ETA: 1:13 - loss: 1.0566 - regression_loss: 0.9276 - classification_loss: 0.1289 311/500 [=================>............] - ETA: 1:13 - loss: 1.0555 - regression_loss: 0.9268 - classification_loss: 0.1287 312/500 [=================>............] - ETA: 1:12 - loss: 1.0565 - regression_loss: 0.9275 - classification_loss: 0.1291 313/500 [=================>............] - ETA: 1:12 - loss: 1.0566 - regression_loss: 0.9276 - classification_loss: 0.1290 314/500 [=================>............] - ETA: 1:11 - loss: 1.0570 - regression_loss: 0.9280 - classification_loss: 0.1290 315/500 [=================>............] - ETA: 1:11 - loss: 1.0568 - regression_loss: 0.9278 - classification_loss: 0.1289 316/500 [=================>............] - ETA: 1:11 - loss: 1.0562 - regression_loss: 0.9273 - classification_loss: 0.1289 317/500 [==================>...........] - ETA: 1:10 - loss: 1.0562 - regression_loss: 0.9272 - classification_loss: 0.1289 318/500 [==================>...........] - ETA: 1:10 - loss: 1.0579 - regression_loss: 0.9288 - classification_loss: 0.1291 319/500 [==================>...........] - ETA: 1:09 - loss: 1.0568 - regression_loss: 0.9278 - classification_loss: 0.1290 320/500 [==================>...........] - ETA: 1:09 - loss: 1.0572 - regression_loss: 0.9281 - classification_loss: 0.1291 321/500 [==================>...........] - ETA: 1:09 - loss: 1.0572 - regression_loss: 0.9281 - classification_loss: 0.1291 322/500 [==================>...........] - ETA: 1:08 - loss: 1.0581 - regression_loss: 0.9289 - classification_loss: 0.1293 323/500 [==================>...........] 
- ETA: 1:08 - loss: 1.0582 - regression_loss: 0.9289 - classification_loss: 0.1293 324/500 [==================>...........] - ETA: 1:08 - loss: 1.0576 - regression_loss: 0.9283 - classification_loss: 0.1293 325/500 [==================>...........] - ETA: 1:07 - loss: 1.0556 - regression_loss: 0.9266 - classification_loss: 0.1290 326/500 [==================>...........] - ETA: 1:07 - loss: 1.0562 - regression_loss: 0.9271 - classification_loss: 0.1291 327/500 [==================>...........] - ETA: 1:06 - loss: 1.0563 - regression_loss: 0.9272 - classification_loss: 0.1291 328/500 [==================>...........] - ETA: 1:06 - loss: 1.0554 - regression_loss: 0.9265 - classification_loss: 0.1290 329/500 [==================>...........] - ETA: 1:06 - loss: 1.0547 - regression_loss: 0.9259 - classification_loss: 0.1288 330/500 [==================>...........] - ETA: 1:05 - loss: 1.0527 - regression_loss: 0.9241 - classification_loss: 0.1286 331/500 [==================>...........] - ETA: 1:05 - loss: 1.0515 - regression_loss: 0.9230 - classification_loss: 0.1284 332/500 [==================>...........] - ETA: 1:04 - loss: 1.0533 - regression_loss: 0.9248 - classification_loss: 0.1285 333/500 [==================>...........] - ETA: 1:04 - loss: 1.0542 - regression_loss: 0.9256 - classification_loss: 0.1286 334/500 [===================>..........] - ETA: 1:04 - loss: 1.0542 - regression_loss: 0.9256 - classification_loss: 0.1285 335/500 [===================>..........] - ETA: 1:03 - loss: 1.0549 - regression_loss: 0.9263 - classification_loss: 0.1286 336/500 [===================>..........] - ETA: 1:03 - loss: 1.0549 - regression_loss: 0.9263 - classification_loss: 0.1286 337/500 [===================>..........] - ETA: 1:02 - loss: 1.0558 - regression_loss: 0.9268 - classification_loss: 0.1290 338/500 [===================>..........] - ETA: 1:02 - loss: 1.0562 - regression_loss: 0.9273 - classification_loss: 0.1289 339/500 [===================>..........] 
- ETA: 1:02 - loss: 1.0549 - regression_loss: 0.9261 - classification_loss: 0.1288 340/500 [===================>..........] - ETA: 1:01 - loss: 1.0540 - regression_loss: 0.9253 - classification_loss: 0.1287 341/500 [===================>..........] - ETA: 1:01 - loss: 1.0535 - regression_loss: 0.9249 - classification_loss: 0.1286 342/500 [===================>..........] - ETA: 1:01 - loss: 1.0538 - regression_loss: 0.9251 - classification_loss: 0.1286 343/500 [===================>..........] - ETA: 1:00 - loss: 1.0525 - regression_loss: 0.9240 - classification_loss: 0.1285 344/500 [===================>..........] - ETA: 1:00 - loss: 1.0521 - regression_loss: 0.9238 - classification_loss: 0.1284 345/500 [===================>..........] - ETA: 59s - loss: 1.0506 - regression_loss: 0.9224 - classification_loss: 0.1282  346/500 [===================>..........] - ETA: 59s - loss: 1.0512 - regression_loss: 0.9229 - classification_loss: 0.1282 347/500 [===================>..........] - ETA: 59s - loss: 1.0509 - regression_loss: 0.9227 - classification_loss: 0.1282 348/500 [===================>..........] - ETA: 58s - loss: 1.0527 - regression_loss: 0.9242 - classification_loss: 0.1285 349/500 [===================>..........] - ETA: 58s - loss: 1.0536 - regression_loss: 0.9250 - classification_loss: 0.1287 350/500 [====================>.........] - ETA: 57s - loss: 1.0536 - regression_loss: 0.9246 - classification_loss: 0.1290 351/500 [====================>.........] - ETA: 57s - loss: 1.0543 - regression_loss: 0.9251 - classification_loss: 0.1292 352/500 [====================>.........] - ETA: 57s - loss: 1.0562 - regression_loss: 0.9267 - classification_loss: 0.1295 353/500 [====================>.........] - ETA: 56s - loss: 1.0566 - regression_loss: 0.9271 - classification_loss: 0.1295 354/500 [====================>.........] - ETA: 56s - loss: 1.0548 - regression_loss: 0.9256 - classification_loss: 0.1293 355/500 [====================>.........] 
- ETA: 56s - loss: 1.0548 - regression_loss: 0.9253 - classification_loss: 0.1294 356/500 [====================>.........] - ETA: 55s - loss: 1.0558 - regression_loss: 0.9260 - classification_loss: 0.1298 357/500 [====================>.........] - ETA: 55s - loss: 1.0558 - regression_loss: 0.9259 - classification_loss: 0.1299 358/500 [====================>.........] - ETA: 54s - loss: 1.0543 - regression_loss: 0.9245 - classification_loss: 0.1297 359/500 [====================>.........] - ETA: 54s - loss: 1.0537 - regression_loss: 0.9240 - classification_loss: 0.1297 360/500 [====================>.........] - ETA: 54s - loss: 1.0548 - regression_loss: 0.9248 - classification_loss: 0.1300 361/500 [====================>.........] - ETA: 53s - loss: 1.0554 - regression_loss: 0.9253 - classification_loss: 0.1301 362/500 [====================>.........] - ETA: 53s - loss: 1.0576 - regression_loss: 0.9269 - classification_loss: 0.1307 363/500 [====================>.........] - ETA: 52s - loss: 1.0564 - regression_loss: 0.9259 - classification_loss: 0.1305 364/500 [====================>.........] - ETA: 52s - loss: 1.0575 - regression_loss: 0.9267 - classification_loss: 0.1308 365/500 [====================>.........] - ETA: 52s - loss: 1.0573 - regression_loss: 0.9265 - classification_loss: 0.1308 366/500 [====================>.........] - ETA: 51s - loss: 1.0576 - regression_loss: 0.9268 - classification_loss: 0.1308 367/500 [=====================>........] - ETA: 51s - loss: 1.0557 - regression_loss: 0.9251 - classification_loss: 0.1306 368/500 [=====================>........] - ETA: 50s - loss: 1.0548 - regression_loss: 0.9243 - classification_loss: 0.1305 369/500 [=====================>........] - ETA: 50s - loss: 1.0543 - regression_loss: 0.9237 - classification_loss: 0.1305 370/500 [=====================>........] - ETA: 50s - loss: 1.0533 - regression_loss: 0.9229 - classification_loss: 0.1304 371/500 [=====================>........] 
- ETA: 49s - loss: 1.0518 - regression_loss: 0.9215 - classification_loss: 0.1302 372/500 [=====================>........] - ETA: 49s - loss: 1.0513 - regression_loss: 0.9213 - classification_loss: 0.1300 373/500 [=====================>........] - ETA: 49s - loss: 1.0520 - regression_loss: 0.9221 - classification_loss: 0.1299 374/500 [=====================>........] - ETA: 48s - loss: 1.0522 - regression_loss: 0.9223 - classification_loss: 0.1299 375/500 [=====================>........] - ETA: 48s - loss: 1.0528 - regression_loss: 0.9228 - classification_loss: 0.1300 376/500 [=====================>........] - ETA: 47s - loss: 1.0525 - regression_loss: 0.9226 - classification_loss: 0.1299 377/500 [=====================>........] - ETA: 47s - loss: 1.0526 - regression_loss: 0.9226 - classification_loss: 0.1299 378/500 [=====================>........] - ETA: 47s - loss: 1.0523 - regression_loss: 0.9224 - classification_loss: 0.1299 379/500 [=====================>........] - ETA: 46s - loss: 1.0530 - regression_loss: 0.9230 - classification_loss: 0.1300 380/500 [=====================>........] - ETA: 46s - loss: 1.0517 - regression_loss: 0.9219 - classification_loss: 0.1299 381/500 [=====================>........] - ETA: 45s - loss: 1.0524 - regression_loss: 0.9226 - classification_loss: 0.1299 382/500 [=====================>........] - ETA: 45s - loss: 1.0528 - regression_loss: 0.9229 - classification_loss: 0.1299 383/500 [=====================>........] - ETA: 45s - loss: 1.0523 - regression_loss: 0.9226 - classification_loss: 0.1297 384/500 [======================>.......] - ETA: 44s - loss: 1.0507 - regression_loss: 0.9213 - classification_loss: 0.1295 385/500 [======================>.......] - ETA: 44s - loss: 1.0507 - regression_loss: 0.9213 - classification_loss: 0.1294 386/500 [======================>.......] - ETA: 44s - loss: 1.0497 - regression_loss: 0.9205 - classification_loss: 0.1292 387/500 [======================>.......] 
- ETA: 43s - loss: 1.0496 - regression_loss: 0.9206 - classification_loss: 0.1290 388/500 [======================>.......] - ETA: 43s - loss: 1.0492 - regression_loss: 0.9202 - classification_loss: 0.1289 389/500 [======================>.......] - ETA: 42s - loss: 1.0474 - regression_loss: 0.9187 - classification_loss: 0.1287 390/500 [======================>.......] - ETA: 42s - loss: 1.0466 - regression_loss: 0.9180 - classification_loss: 0.1286 391/500 [======================>.......] - ETA: 42s - loss: 1.0472 - regression_loss: 0.9185 - classification_loss: 0.1287 392/500 [======================>.......] - ETA: 41s - loss: 1.0472 - regression_loss: 0.9186 - classification_loss: 0.1286 393/500 [======================>.......] - ETA: 41s - loss: 1.0477 - regression_loss: 0.9189 - classification_loss: 0.1287 394/500 [======================>.......] - ETA: 40s - loss: 1.0480 - regression_loss: 0.9192 - classification_loss: 0.1288 395/500 [======================>.......] - ETA: 40s - loss: 1.0488 - regression_loss: 0.9198 - classification_loss: 0.1289 396/500 [======================>.......] - ETA: 40s - loss: 1.0479 - regression_loss: 0.9191 - classification_loss: 0.1288 397/500 [======================>.......] - ETA: 39s - loss: 1.0484 - regression_loss: 0.9195 - classification_loss: 0.1289 398/500 [======================>.......] - ETA: 39s - loss: 1.0470 - regression_loss: 0.9181 - classification_loss: 0.1288 399/500 [======================>.......] - ETA: 39s - loss: 1.0466 - regression_loss: 0.9178 - classification_loss: 0.1289 400/500 [=======================>......] - ETA: 38s - loss: 1.0466 - regression_loss: 0.9178 - classification_loss: 0.1288 401/500 [=======================>......] - ETA: 38s - loss: 1.0467 - regression_loss: 0.9179 - classification_loss: 0.1288 402/500 [=======================>......] - ETA: 37s - loss: 1.0459 - regression_loss: 0.9172 - classification_loss: 0.1287 403/500 [=======================>......] 
- ETA: 37s - loss: 1.0463 - regression_loss: 0.9176 - classification_loss: 0.1287 404/500 [=======================>......] - ETA: 37s - loss: 1.0470 - regression_loss: 0.9182 - classification_loss: 0.1288 405/500 [=======================>......] - ETA: 36s - loss: 1.0465 - regression_loss: 0.9177 - classification_loss: 0.1288 406/500 [=======================>......] - ETA: 36s - loss: 1.0476 - regression_loss: 0.9187 - classification_loss: 0.1289 407/500 [=======================>......] - ETA: 35s - loss: 1.0481 - regression_loss: 0.9191 - classification_loss: 0.1290 408/500 [=======================>......] - ETA: 35s - loss: 1.0508 - regression_loss: 0.9214 - classification_loss: 0.1295 409/500 [=======================>......] - ETA: 35s - loss: 1.0514 - regression_loss: 0.9219 - classification_loss: 0.1295 410/500 [=======================>......] - ETA: 34s - loss: 1.0515 - regression_loss: 0.9220 - classification_loss: 0.1295 411/500 [=======================>......] - ETA: 34s - loss: 1.0526 - regression_loss: 0.9228 - classification_loss: 0.1298 412/500 [=======================>......] - ETA: 34s - loss: 1.0535 - regression_loss: 0.9235 - classification_loss: 0.1300 413/500 [=======================>......] - ETA: 33s - loss: 1.0536 - regression_loss: 0.9236 - classification_loss: 0.1301 414/500 [=======================>......] - ETA: 33s - loss: 1.0526 - regression_loss: 0.9227 - classification_loss: 0.1299 415/500 [=======================>......] - ETA: 32s - loss: 1.0537 - regression_loss: 0.9239 - classification_loss: 0.1298 416/500 [=======================>......] - ETA: 32s - loss: 1.0532 - regression_loss: 0.9234 - classification_loss: 0.1297 417/500 [========================>.....] - ETA: 32s - loss: 1.0533 - regression_loss: 0.9235 - classification_loss: 0.1298 418/500 [========================>.....] - ETA: 31s - loss: 1.0531 - regression_loss: 0.9233 - classification_loss: 0.1298 419/500 [========================>.....] 
[... Epoch 9 per-batch progress lines elided (batches 420-499; loss decreased from ~1.052 to ~1.036) ...]
500/500 [==============================] - 194s 388ms/step - loss: 1.0361 - regression_loss: 0.9073 - classification_loss: 0.1287
1172 instances of class plum with average precision: 0.6816
mAP: 0.6816
Epoch 00009: saving model to ./training/snapshots/resnet101_pascal_09.h5
Epoch 10/20
[... Epoch 10 per-batch progress lines elided (batches 1-254; loss fluctuated between ~0.84 and ~0.98, ending near 0.956 - regression_loss: 0.8395 - classification_loss: 0.1167) ...]
- ETA: 1:35 - loss: 0.9561 - regression_loss: 0.8395 - classification_loss: 0.1166 255/500 [==============>...............] - ETA: 1:35 - loss: 0.9552 - regression_loss: 0.8387 - classification_loss: 0.1165 256/500 [==============>...............] - ETA: 1:35 - loss: 0.9532 - regression_loss: 0.8369 - classification_loss: 0.1163 257/500 [==============>...............] - ETA: 1:34 - loss: 0.9524 - regression_loss: 0.8362 - classification_loss: 0.1162 258/500 [==============>...............] - ETA: 1:34 - loss: 0.9503 - regression_loss: 0.8345 - classification_loss: 0.1159 259/500 [==============>...............] - ETA: 1:34 - loss: 0.9500 - regression_loss: 0.8342 - classification_loss: 0.1157 260/500 [==============>...............] - ETA: 1:33 - loss: 0.9478 - regression_loss: 0.8324 - classification_loss: 0.1154 261/500 [==============>...............] - ETA: 1:33 - loss: 0.9459 - regression_loss: 0.8308 - classification_loss: 0.1151 262/500 [==============>...............] - ETA: 1:32 - loss: 0.9482 - regression_loss: 0.8329 - classification_loss: 0.1153 263/500 [==============>...............] - ETA: 1:32 - loss: 0.9490 - regression_loss: 0.8336 - classification_loss: 0.1154 264/500 [==============>...............] - ETA: 1:32 - loss: 0.9482 - regression_loss: 0.8330 - classification_loss: 0.1152 265/500 [==============>...............] - ETA: 1:31 - loss: 0.9476 - regression_loss: 0.8325 - classification_loss: 0.1152 266/500 [==============>...............] - ETA: 1:31 - loss: 0.9486 - regression_loss: 0.8334 - classification_loss: 0.1152 267/500 [===============>..............] - ETA: 1:30 - loss: 0.9517 - regression_loss: 0.8365 - classification_loss: 0.1152 268/500 [===============>..............] - ETA: 1:30 - loss: 0.9523 - regression_loss: 0.8372 - classification_loss: 0.1151 269/500 [===============>..............] - ETA: 1:30 - loss: 0.9532 - regression_loss: 0.8378 - classification_loss: 0.1154 270/500 [===============>..............] 
- ETA: 1:29 - loss: 0.9517 - regression_loss: 0.8366 - classification_loss: 0.1151 271/500 [===============>..............] - ETA: 1:29 - loss: 0.9505 - regression_loss: 0.8356 - classification_loss: 0.1149 272/500 [===============>..............] - ETA: 1:28 - loss: 0.9495 - regression_loss: 0.8347 - classification_loss: 0.1148 273/500 [===============>..............] - ETA: 1:28 - loss: 0.9481 - regression_loss: 0.8336 - classification_loss: 0.1146 274/500 [===============>..............] - ETA: 1:28 - loss: 0.9495 - regression_loss: 0.8347 - classification_loss: 0.1148 275/500 [===============>..............] - ETA: 1:27 - loss: 0.9509 - regression_loss: 0.8359 - classification_loss: 0.1150 276/500 [===============>..............] - ETA: 1:27 - loss: 0.9507 - regression_loss: 0.8359 - classification_loss: 0.1148 277/500 [===============>..............] - ETA: 1:26 - loss: 0.9514 - regression_loss: 0.8365 - classification_loss: 0.1148 278/500 [===============>..............] - ETA: 1:26 - loss: 0.9502 - regression_loss: 0.8355 - classification_loss: 0.1146 279/500 [===============>..............] - ETA: 1:26 - loss: 0.9517 - regression_loss: 0.8368 - classification_loss: 0.1149 280/500 [===============>..............] - ETA: 1:25 - loss: 0.9536 - regression_loss: 0.8384 - classification_loss: 0.1152 281/500 [===============>..............] - ETA: 1:25 - loss: 0.9527 - regression_loss: 0.8376 - classification_loss: 0.1151 282/500 [===============>..............] - ETA: 1:25 - loss: 0.9529 - regression_loss: 0.8378 - classification_loss: 0.1150 283/500 [===============>..............] - ETA: 1:24 - loss: 0.9520 - regression_loss: 0.8371 - classification_loss: 0.1149 284/500 [================>.............] - ETA: 1:24 - loss: 0.9526 - regression_loss: 0.8376 - classification_loss: 0.1150 285/500 [================>.............] - ETA: 1:23 - loss: 0.9528 - regression_loss: 0.8378 - classification_loss: 0.1150 286/500 [================>.............] 
- ETA: 1:23 - loss: 0.9536 - regression_loss: 0.8385 - classification_loss: 0.1151 287/500 [================>.............] - ETA: 1:23 - loss: 0.9538 - regression_loss: 0.8388 - classification_loss: 0.1150 288/500 [================>.............] - ETA: 1:22 - loss: 0.9536 - regression_loss: 0.8385 - classification_loss: 0.1151 289/500 [================>.............] - ETA: 1:22 - loss: 0.9535 - regression_loss: 0.8385 - classification_loss: 0.1151 290/500 [================>.............] - ETA: 1:21 - loss: 0.9521 - regression_loss: 0.8372 - classification_loss: 0.1149 291/500 [================>.............] - ETA: 1:21 - loss: 0.9517 - regression_loss: 0.8369 - classification_loss: 0.1148 292/500 [================>.............] - ETA: 1:21 - loss: 0.9520 - regression_loss: 0.8370 - classification_loss: 0.1150 293/500 [================>.............] - ETA: 1:20 - loss: 0.9525 - regression_loss: 0.8376 - classification_loss: 0.1149 294/500 [================>.............] - ETA: 1:20 - loss: 0.9530 - regression_loss: 0.8382 - classification_loss: 0.1148 295/500 [================>.............] - ETA: 1:20 - loss: 0.9532 - regression_loss: 0.8383 - classification_loss: 0.1149 296/500 [================>.............] - ETA: 1:19 - loss: 0.9529 - regression_loss: 0.8380 - classification_loss: 0.1148 297/500 [================>.............] - ETA: 1:19 - loss: 0.9528 - regression_loss: 0.8379 - classification_loss: 0.1149 298/500 [================>.............] - ETA: 1:18 - loss: 0.9542 - regression_loss: 0.8392 - classification_loss: 0.1150 299/500 [================>.............] - ETA: 1:18 - loss: 0.9526 - regression_loss: 0.8378 - classification_loss: 0.1148 300/500 [=================>............] - ETA: 1:18 - loss: 0.9525 - regression_loss: 0.8377 - classification_loss: 0.1148 301/500 [=================>............] - ETA: 1:17 - loss: 0.9542 - regression_loss: 0.8392 - classification_loss: 0.1150 302/500 [=================>............] 
- ETA: 1:17 - loss: 0.9548 - regression_loss: 0.8398 - classification_loss: 0.1150 303/500 [=================>............] - ETA: 1:16 - loss: 0.9550 - regression_loss: 0.8401 - classification_loss: 0.1150 304/500 [=================>............] - ETA: 1:16 - loss: 0.9551 - regression_loss: 0.8401 - classification_loss: 0.1150 305/500 [=================>............] - ETA: 1:16 - loss: 0.9534 - regression_loss: 0.8385 - classification_loss: 0.1149 306/500 [=================>............] - ETA: 1:15 - loss: 0.9543 - regression_loss: 0.8392 - classification_loss: 0.1151 307/500 [=================>............] - ETA: 1:15 - loss: 0.9540 - regression_loss: 0.8388 - classification_loss: 0.1151 308/500 [=================>............] - ETA: 1:14 - loss: 0.9555 - regression_loss: 0.8401 - classification_loss: 0.1154 309/500 [=================>............] - ETA: 1:14 - loss: 0.9576 - regression_loss: 0.8421 - classification_loss: 0.1155 310/500 [=================>............] - ETA: 1:14 - loss: 0.9577 - regression_loss: 0.8422 - classification_loss: 0.1155 311/500 [=================>............] - ETA: 1:13 - loss: 0.9579 - regression_loss: 0.8422 - classification_loss: 0.1156 312/500 [=================>............] - ETA: 1:13 - loss: 0.9591 - regression_loss: 0.8433 - classification_loss: 0.1158 313/500 [=================>............] - ETA: 1:12 - loss: 0.9578 - regression_loss: 0.8422 - classification_loss: 0.1156 314/500 [=================>............] - ETA: 1:12 - loss: 0.9588 - regression_loss: 0.8431 - classification_loss: 0.1157 315/500 [=================>............] - ETA: 1:12 - loss: 0.9595 - regression_loss: 0.8436 - classification_loss: 0.1159 316/500 [=================>............] - ETA: 1:11 - loss: 0.9580 - regression_loss: 0.8424 - classification_loss: 0.1157 317/500 [==================>...........] - ETA: 1:11 - loss: 0.9592 - regression_loss: 0.8434 - classification_loss: 0.1158 318/500 [==================>...........] 
- ETA: 1:10 - loss: 0.9581 - regression_loss: 0.8425 - classification_loss: 0.1157 319/500 [==================>...........] - ETA: 1:10 - loss: 0.9560 - regression_loss: 0.8406 - classification_loss: 0.1154 320/500 [==================>...........] - ETA: 1:10 - loss: 0.9555 - regression_loss: 0.8401 - classification_loss: 0.1154 321/500 [==================>...........] - ETA: 1:09 - loss: 0.9536 - regression_loss: 0.8385 - classification_loss: 0.1152 322/500 [==================>...........] - ETA: 1:09 - loss: 0.9531 - regression_loss: 0.8380 - classification_loss: 0.1151 323/500 [==================>...........] - ETA: 1:09 - loss: 0.9545 - regression_loss: 0.8392 - classification_loss: 0.1153 324/500 [==================>...........] - ETA: 1:08 - loss: 0.9547 - regression_loss: 0.8393 - classification_loss: 0.1153 325/500 [==================>...........] - ETA: 1:08 - loss: 0.9534 - regression_loss: 0.8382 - classification_loss: 0.1152 326/500 [==================>...........] - ETA: 1:07 - loss: 0.9531 - regression_loss: 0.8379 - classification_loss: 0.1152 327/500 [==================>...........] - ETA: 1:07 - loss: 0.9534 - regression_loss: 0.8380 - classification_loss: 0.1153 328/500 [==================>...........] - ETA: 1:07 - loss: 0.9545 - regression_loss: 0.8389 - classification_loss: 0.1156 329/500 [==================>...........] - ETA: 1:06 - loss: 0.9538 - regression_loss: 0.8384 - classification_loss: 0.1154 330/500 [==================>...........] - ETA: 1:06 - loss: 0.9544 - regression_loss: 0.8389 - classification_loss: 0.1155 331/500 [==================>...........] - ETA: 1:05 - loss: 0.9565 - regression_loss: 0.8406 - classification_loss: 0.1159 332/500 [==================>...........] - ETA: 1:05 - loss: 0.9567 - regression_loss: 0.8407 - classification_loss: 0.1160 333/500 [==================>...........] - ETA: 1:05 - loss: 0.9572 - regression_loss: 0.8412 - classification_loss: 0.1161 334/500 [===================>..........] 
- ETA: 1:04 - loss: 0.9562 - regression_loss: 0.8403 - classification_loss: 0.1160 335/500 [===================>..........] - ETA: 1:04 - loss: 0.9561 - regression_loss: 0.8403 - classification_loss: 0.1158 336/500 [===================>..........] - ETA: 1:03 - loss: 0.9555 - regression_loss: 0.8398 - classification_loss: 0.1157 337/500 [===================>..........] - ETA: 1:03 - loss: 0.9550 - regression_loss: 0.8392 - classification_loss: 0.1157 338/500 [===================>..........] - ETA: 1:03 - loss: 0.9550 - regression_loss: 0.8392 - classification_loss: 0.1157 339/500 [===================>..........] - ETA: 1:02 - loss: 0.9541 - regression_loss: 0.8385 - classification_loss: 0.1156 340/500 [===================>..........] - ETA: 1:02 - loss: 0.9539 - regression_loss: 0.8384 - classification_loss: 0.1155 341/500 [===================>..........] - ETA: 1:02 - loss: 0.9538 - regression_loss: 0.8383 - classification_loss: 0.1155 342/500 [===================>..........] - ETA: 1:01 - loss: 0.9524 - regression_loss: 0.8371 - classification_loss: 0.1153 343/500 [===================>..........] - ETA: 1:01 - loss: 0.9534 - regression_loss: 0.8380 - classification_loss: 0.1154 344/500 [===================>..........] - ETA: 1:00 - loss: 0.9531 - regression_loss: 0.8377 - classification_loss: 0.1153 345/500 [===================>..........] - ETA: 1:00 - loss: 0.9526 - regression_loss: 0.8373 - classification_loss: 0.1153 346/500 [===================>..........] - ETA: 1:00 - loss: 0.9522 - regression_loss: 0.8370 - classification_loss: 0.1152 347/500 [===================>..........] - ETA: 59s - loss: 0.9518 - regression_loss: 0.8367 - classification_loss: 0.1151  348/500 [===================>..........] - ETA: 59s - loss: 0.9528 - regression_loss: 0.8375 - classification_loss: 0.1153 349/500 [===================>..........] - ETA: 58s - loss: 0.9536 - regression_loss: 0.8382 - classification_loss: 0.1154 350/500 [====================>.........] 
- ETA: 58s - loss: 0.9526 - regression_loss: 0.8373 - classification_loss: 0.1152 351/500 [====================>.........] - ETA: 58s - loss: 0.9540 - regression_loss: 0.8385 - classification_loss: 0.1155 352/500 [====================>.........] - ETA: 57s - loss: 0.9547 - regression_loss: 0.8391 - classification_loss: 0.1156 353/500 [====================>.........] - ETA: 57s - loss: 0.9548 - regression_loss: 0.8391 - classification_loss: 0.1157 354/500 [====================>.........] - ETA: 56s - loss: 0.9547 - regression_loss: 0.8391 - classification_loss: 0.1156 355/500 [====================>.........] - ETA: 56s - loss: 0.9538 - regression_loss: 0.8383 - classification_loss: 0.1155 356/500 [====================>.........] - ETA: 56s - loss: 0.9522 - regression_loss: 0.8370 - classification_loss: 0.1153 357/500 [====================>.........] - ETA: 55s - loss: 0.9512 - regression_loss: 0.8361 - classification_loss: 0.1151 358/500 [====================>.........] - ETA: 55s - loss: 0.9507 - regression_loss: 0.8356 - classification_loss: 0.1151 359/500 [====================>.........] - ETA: 55s - loss: 0.9491 - regression_loss: 0.8343 - classification_loss: 0.1149 360/500 [====================>.........] - ETA: 54s - loss: 0.9477 - regression_loss: 0.8330 - classification_loss: 0.1147 361/500 [====================>.........] - ETA: 54s - loss: 0.9475 - regression_loss: 0.8329 - classification_loss: 0.1146 362/500 [====================>.........] - ETA: 53s - loss: 0.9484 - regression_loss: 0.8337 - classification_loss: 0.1148 363/500 [====================>.........] - ETA: 53s - loss: 0.9475 - regression_loss: 0.8329 - classification_loss: 0.1146 364/500 [====================>.........] - ETA: 53s - loss: 0.9485 - regression_loss: 0.8336 - classification_loss: 0.1148 365/500 [====================>.........] - ETA: 52s - loss: 0.9487 - regression_loss: 0.8339 - classification_loss: 0.1149 366/500 [====================>.........] 
- ETA: 52s - loss: 0.9489 - regression_loss: 0.8341 - classification_loss: 0.1148 367/500 [=====================>........] - ETA: 51s - loss: 0.9479 - regression_loss: 0.8332 - classification_loss: 0.1147 368/500 [=====================>........] - ETA: 51s - loss: 0.9478 - regression_loss: 0.8331 - classification_loss: 0.1147 369/500 [=====================>........] - ETA: 51s - loss: 0.9479 - regression_loss: 0.8331 - classification_loss: 0.1147 370/500 [=====================>........] - ETA: 50s - loss: 0.9476 - regression_loss: 0.8329 - classification_loss: 0.1147 371/500 [=====================>........] - ETA: 50s - loss: 0.9490 - regression_loss: 0.8341 - classification_loss: 0.1149 372/500 [=====================>........] - ETA: 49s - loss: 0.9485 - regression_loss: 0.8336 - classification_loss: 0.1149 373/500 [=====================>........] - ETA: 49s - loss: 0.9486 - regression_loss: 0.8337 - classification_loss: 0.1149 374/500 [=====================>........] - ETA: 49s - loss: 0.9475 - regression_loss: 0.8328 - classification_loss: 0.1147 375/500 [=====================>........] - ETA: 48s - loss: 0.9473 - regression_loss: 0.8326 - classification_loss: 0.1146 376/500 [=====================>........] - ETA: 48s - loss: 0.9465 - regression_loss: 0.8320 - classification_loss: 0.1145 377/500 [=====================>........] - ETA: 47s - loss: 0.9478 - regression_loss: 0.8332 - classification_loss: 0.1146 378/500 [=====================>........] - ETA: 47s - loss: 0.9500 - regression_loss: 0.8353 - classification_loss: 0.1147 379/500 [=====================>........] - ETA: 47s - loss: 0.9517 - regression_loss: 0.8368 - classification_loss: 0.1149 380/500 [=====================>........] - ETA: 46s - loss: 0.9530 - regression_loss: 0.8377 - classification_loss: 0.1153 381/500 [=====================>........] - ETA: 46s - loss: 0.9545 - regression_loss: 0.8388 - classification_loss: 0.1157 382/500 [=====================>........] 
- ETA: 45s - loss: 0.9563 - regression_loss: 0.8403 - classification_loss: 0.1161 383/500 [=====================>........] - ETA: 45s - loss: 0.9571 - regression_loss: 0.8409 - classification_loss: 0.1162 384/500 [======================>.......] - ETA: 45s - loss: 0.9562 - regression_loss: 0.8402 - classification_loss: 0.1160 385/500 [======================>.......] - ETA: 44s - loss: 0.9571 - regression_loss: 0.8412 - classification_loss: 0.1159 386/500 [======================>.......] - ETA: 44s - loss: 0.9570 - regression_loss: 0.8411 - classification_loss: 0.1160 387/500 [======================>.......] - ETA: 44s - loss: 0.9571 - regression_loss: 0.8411 - classification_loss: 0.1160 388/500 [======================>.......] - ETA: 43s - loss: 0.9574 - regression_loss: 0.8414 - classification_loss: 0.1160 389/500 [======================>.......] - ETA: 43s - loss: 0.9562 - regression_loss: 0.8403 - classification_loss: 0.1159 390/500 [======================>.......] - ETA: 42s - loss: 0.9580 - regression_loss: 0.8415 - classification_loss: 0.1165 391/500 [======================>.......] - ETA: 42s - loss: 0.9589 - regression_loss: 0.8423 - classification_loss: 0.1166 392/500 [======================>.......] - ETA: 42s - loss: 0.9597 - regression_loss: 0.8430 - classification_loss: 0.1167 393/500 [======================>.......] - ETA: 41s - loss: 0.9601 - regression_loss: 0.8432 - classification_loss: 0.1169 394/500 [======================>.......] - ETA: 41s - loss: 0.9603 - regression_loss: 0.8435 - classification_loss: 0.1169 395/500 [======================>.......] - ETA: 40s - loss: 0.9612 - regression_loss: 0.8442 - classification_loss: 0.1170 396/500 [======================>.......] - ETA: 40s - loss: 0.9608 - regression_loss: 0.8438 - classification_loss: 0.1170 397/500 [======================>.......] - ETA: 40s - loss: 0.9600 - regression_loss: 0.8430 - classification_loss: 0.1170 398/500 [======================>.......] 
- ETA: 39s - loss: 0.9594 - regression_loss: 0.8424 - classification_loss: 0.1170 399/500 [======================>.......] - ETA: 39s - loss: 0.9598 - regression_loss: 0.8427 - classification_loss: 0.1171 400/500 [=======================>......] - ETA: 38s - loss: 0.9606 - regression_loss: 0.8434 - classification_loss: 0.1172 401/500 [=======================>......] - ETA: 38s - loss: 0.9607 - regression_loss: 0.8435 - classification_loss: 0.1173 402/500 [=======================>......] - ETA: 38s - loss: 0.9608 - regression_loss: 0.8435 - classification_loss: 0.1173 403/500 [=======================>......] - ETA: 37s - loss: 0.9608 - regression_loss: 0.8435 - classification_loss: 0.1173 404/500 [=======================>......] - ETA: 37s - loss: 0.9605 - regression_loss: 0.8432 - classification_loss: 0.1173 405/500 [=======================>......] - ETA: 37s - loss: 0.9601 - regression_loss: 0.8428 - classification_loss: 0.1174 406/500 [=======================>......] - ETA: 36s - loss: 0.9609 - regression_loss: 0.8435 - classification_loss: 0.1175 407/500 [=======================>......] - ETA: 36s - loss: 0.9608 - regression_loss: 0.8434 - classification_loss: 0.1174 408/500 [=======================>......] - ETA: 35s - loss: 0.9610 - regression_loss: 0.8436 - classification_loss: 0.1174 409/500 [=======================>......] - ETA: 35s - loss: 0.9614 - regression_loss: 0.8439 - classification_loss: 0.1175 410/500 [=======================>......] - ETA: 35s - loss: 0.9621 - regression_loss: 0.8445 - classification_loss: 0.1177 411/500 [=======================>......] - ETA: 34s - loss: 0.9632 - regression_loss: 0.8453 - classification_loss: 0.1178 412/500 [=======================>......] - ETA: 34s - loss: 0.9627 - regression_loss: 0.8449 - classification_loss: 0.1178 413/500 [=======================>......] - ETA: 33s - loss: 0.9637 - regression_loss: 0.8459 - classification_loss: 0.1179 414/500 [=======================>......] 
- ETA: 33s - loss: 0.9640 - regression_loss: 0.8461 - classification_loss: 0.1179 415/500 [=======================>......] - ETA: 33s - loss: 0.9640 - regression_loss: 0.8461 - classification_loss: 0.1179 416/500 [=======================>......] - ETA: 32s - loss: 0.9633 - regression_loss: 0.8455 - classification_loss: 0.1178 417/500 [========================>.....] - ETA: 32s - loss: 0.9637 - regression_loss: 0.8459 - classification_loss: 0.1178 418/500 [========================>.....] - ETA: 31s - loss: 0.9625 - regression_loss: 0.8448 - classification_loss: 0.1177 419/500 [========================>.....] - ETA: 31s - loss: 0.9634 - regression_loss: 0.8457 - classification_loss: 0.1178 420/500 [========================>.....] - ETA: 31s - loss: 0.9620 - regression_loss: 0.8444 - classification_loss: 0.1176 421/500 [========================>.....] - ETA: 30s - loss: 0.9621 - regression_loss: 0.8446 - classification_loss: 0.1176 422/500 [========================>.....] - ETA: 30s - loss: 0.9621 - regression_loss: 0.8445 - classification_loss: 0.1175 423/500 [========================>.....] - ETA: 29s - loss: 0.9620 - regression_loss: 0.8445 - classification_loss: 0.1175 424/500 [========================>.....] - ETA: 29s - loss: 0.9613 - regression_loss: 0.8440 - classification_loss: 0.1173 425/500 [========================>.....] - ETA: 29s - loss: 0.9616 - regression_loss: 0.8442 - classification_loss: 0.1174 426/500 [========================>.....] - ETA: 28s - loss: 0.9618 - regression_loss: 0.8444 - classification_loss: 0.1174 427/500 [========================>.....] - ETA: 28s - loss: 0.9612 - regression_loss: 0.8438 - classification_loss: 0.1174 428/500 [========================>.....] - ETA: 28s - loss: 0.9609 - regression_loss: 0.8437 - classification_loss: 0.1173 429/500 [========================>.....] - ETA: 27s - loss: 0.9604 - regression_loss: 0.8433 - classification_loss: 0.1172 430/500 [========================>.....] 
- ETA: 27s - loss: 0.9608 - regression_loss: 0.8436 - classification_loss: 0.1172 431/500 [========================>.....] - ETA: 26s - loss: 0.9606 - regression_loss: 0.8434 - classification_loss: 0.1172 432/500 [========================>.....] - ETA: 26s - loss: 0.9620 - regression_loss: 0.8447 - classification_loss: 0.1173 433/500 [========================>.....] - ETA: 26s - loss: 0.9624 - regression_loss: 0.8450 - classification_loss: 0.1174 434/500 [=========================>....] - ETA: 25s - loss: 0.9618 - regression_loss: 0.8445 - classification_loss: 0.1173 435/500 [=========================>....] - ETA: 25s - loss: 0.9622 - regression_loss: 0.8448 - classification_loss: 0.1174 436/500 [=========================>....] - ETA: 24s - loss: 0.9613 - regression_loss: 0.8441 - classification_loss: 0.1172 437/500 [=========================>....] - ETA: 24s - loss: 0.9612 - regression_loss: 0.8440 - classification_loss: 0.1172 438/500 [=========================>....] - ETA: 24s - loss: 0.9611 - regression_loss: 0.8440 - classification_loss: 0.1171 439/500 [=========================>....] - ETA: 23s - loss: 0.9613 - regression_loss: 0.8443 - classification_loss: 0.1170 440/500 [=========================>....] - ETA: 23s - loss: 0.9616 - regression_loss: 0.8446 - classification_loss: 0.1169 441/500 [=========================>....] - ETA: 22s - loss: 0.9611 - regression_loss: 0.8443 - classification_loss: 0.1168 442/500 [=========================>....] - ETA: 22s - loss: 0.9617 - regression_loss: 0.8448 - classification_loss: 0.1169 443/500 [=========================>....] - ETA: 22s - loss: 0.9623 - regression_loss: 0.8454 - classification_loss: 0.1168 444/500 [=========================>....] - ETA: 21s - loss: 0.9626 - regression_loss: 0.8458 - classification_loss: 0.1168 445/500 [=========================>....] - ETA: 21s - loss: 0.9622 - regression_loss: 0.8455 - classification_loss: 0.1167 446/500 [=========================>....] 
- ETA: 20s - loss: 0.9625 - regression_loss: 0.8456 - classification_loss: 0.1169 447/500 [=========================>....] - ETA: 20s - loss: 0.9615 - regression_loss: 0.8447 - classification_loss: 0.1168 448/500 [=========================>....] - ETA: 20s - loss: 0.9618 - regression_loss: 0.8449 - classification_loss: 0.1169 449/500 [=========================>....] - ETA: 19s - loss: 0.9612 - regression_loss: 0.8444 - classification_loss: 0.1169 450/500 [==========================>...] - ETA: 19s - loss: 0.9615 - regression_loss: 0.8446 - classification_loss: 0.1169 451/500 [==========================>...] - ETA: 19s - loss: 0.9607 - regression_loss: 0.8439 - classification_loss: 0.1168 452/500 [==========================>...] - ETA: 18s - loss: 0.9594 - regression_loss: 0.8428 - classification_loss: 0.1166 453/500 [==========================>...] - ETA: 18s - loss: 0.9593 - regression_loss: 0.8428 - classification_loss: 0.1166 454/500 [==========================>...] - ETA: 17s - loss: 0.9585 - regression_loss: 0.8421 - classification_loss: 0.1164 455/500 [==========================>...] - ETA: 17s - loss: 0.9573 - regression_loss: 0.8411 - classification_loss: 0.1162 456/500 [==========================>...] - ETA: 17s - loss: 0.9568 - regression_loss: 0.8406 - classification_loss: 0.1161 457/500 [==========================>...] - ETA: 16s - loss: 0.9561 - regression_loss: 0.8401 - classification_loss: 0.1160 458/500 [==========================>...] - ETA: 16s - loss: 0.9553 - regression_loss: 0.8395 - classification_loss: 0.1158 459/500 [==========================>...] - ETA: 15s - loss: 0.9553 - regression_loss: 0.8395 - classification_loss: 0.1158 460/500 [==========================>...] - ETA: 15s - loss: 0.9558 - regression_loss: 0.8398 - classification_loss: 0.1159 461/500 [==========================>...] - ETA: 15s - loss: 0.9554 - regression_loss: 0.8394 - classification_loss: 0.1159 462/500 [==========================>...] 
500/500 [==============================] - 194s 389ms/step - loss: 0.9504 - regression_loss: 0.8348 - classification_loss: 0.1156
1172 instances of class plum with average precision: 0.6819
mAP: 0.6819
Epoch 00010: saving model to ./training/snapshots/resnet101_pascal_10.h5
Epoch 11/20
[per-step progress-bar updates elided; epoch 11 in progress]
- ETA: 1:18 - loss: 0.9241 - regression_loss: 0.8153 - classification_loss: 0.1088 298/500 [================>.............] - ETA: 1:18 - loss: 0.9229 - regression_loss: 0.8142 - classification_loss: 0.1087 299/500 [================>.............] - ETA: 1:17 - loss: 0.9211 - regression_loss: 0.8126 - classification_loss: 0.1085 300/500 [=================>............] - ETA: 1:17 - loss: 0.9218 - regression_loss: 0.8133 - classification_loss: 0.1085 301/500 [=================>............] - ETA: 1:16 - loss: 0.9233 - regression_loss: 0.8146 - classification_loss: 0.1087 302/500 [=================>............] - ETA: 1:16 - loss: 0.9235 - regression_loss: 0.8148 - classification_loss: 0.1087 303/500 [=================>............] - ETA: 1:16 - loss: 0.9245 - regression_loss: 0.8158 - classification_loss: 0.1087 304/500 [=================>............] - ETA: 1:15 - loss: 0.9254 - regression_loss: 0.8165 - classification_loss: 0.1089 305/500 [=================>............] - ETA: 1:15 - loss: 0.9253 - regression_loss: 0.8165 - classification_loss: 0.1088 306/500 [=================>............] - ETA: 1:15 - loss: 0.9250 - regression_loss: 0.8163 - classification_loss: 0.1087 307/500 [=================>............] - ETA: 1:14 - loss: 0.9235 - regression_loss: 0.8149 - classification_loss: 0.1086 308/500 [=================>............] - ETA: 1:14 - loss: 0.9239 - regression_loss: 0.8153 - classification_loss: 0.1086 309/500 [=================>............] - ETA: 1:13 - loss: 0.9243 - regression_loss: 0.8157 - classification_loss: 0.1086 310/500 [=================>............] - ETA: 1:13 - loss: 0.9238 - regression_loss: 0.8154 - classification_loss: 0.1084 311/500 [=================>............] - ETA: 1:13 - loss: 0.9226 - regression_loss: 0.8144 - classification_loss: 0.1082 312/500 [=================>............] - ETA: 1:12 - loss: 0.9237 - regression_loss: 0.8154 - classification_loss: 0.1083 313/500 [=================>............] 
- ETA: 1:12 - loss: 0.9243 - regression_loss: 0.8160 - classification_loss: 0.1083 314/500 [=================>............] - ETA: 1:11 - loss: 0.9228 - regression_loss: 0.8147 - classification_loss: 0.1081 315/500 [=================>............] - ETA: 1:11 - loss: 0.9229 - regression_loss: 0.8149 - classification_loss: 0.1081 316/500 [=================>............] - ETA: 1:11 - loss: 0.9214 - regression_loss: 0.8135 - classification_loss: 0.1078 317/500 [==================>...........] - ETA: 1:10 - loss: 0.9234 - regression_loss: 0.8153 - classification_loss: 0.1081 318/500 [==================>...........] - ETA: 1:10 - loss: 0.9246 - regression_loss: 0.8163 - classification_loss: 0.1083 319/500 [==================>...........] - ETA: 1:10 - loss: 0.9243 - regression_loss: 0.8160 - classification_loss: 0.1083 320/500 [==================>...........] - ETA: 1:09 - loss: 0.9232 - regression_loss: 0.8151 - classification_loss: 0.1081 321/500 [==================>...........] - ETA: 1:09 - loss: 0.9232 - regression_loss: 0.8151 - classification_loss: 0.1081 322/500 [==================>...........] - ETA: 1:08 - loss: 0.9235 - regression_loss: 0.8152 - classification_loss: 0.1083 323/500 [==================>...........] - ETA: 1:08 - loss: 0.9227 - regression_loss: 0.8145 - classification_loss: 0.1082 324/500 [==================>...........] - ETA: 1:08 - loss: 0.9220 - regression_loss: 0.8139 - classification_loss: 0.1081 325/500 [==================>...........] - ETA: 1:07 - loss: 0.9226 - regression_loss: 0.8145 - classification_loss: 0.1081 326/500 [==================>...........] - ETA: 1:07 - loss: 0.9221 - regression_loss: 0.8140 - classification_loss: 0.1081 327/500 [==================>...........] - ETA: 1:06 - loss: 0.9228 - regression_loss: 0.8147 - classification_loss: 0.1081 328/500 [==================>...........] - ETA: 1:06 - loss: 0.9243 - regression_loss: 0.8160 - classification_loss: 0.1083 329/500 [==================>...........] 
- ETA: 1:06 - loss: 0.9234 - regression_loss: 0.8153 - classification_loss: 0.1081 330/500 [==================>...........] - ETA: 1:05 - loss: 0.9220 - regression_loss: 0.8140 - classification_loss: 0.1079 331/500 [==================>...........] - ETA: 1:05 - loss: 0.9215 - regression_loss: 0.8136 - classification_loss: 0.1079 332/500 [==================>...........] - ETA: 1:05 - loss: 0.9211 - regression_loss: 0.8133 - classification_loss: 0.1078 333/500 [==================>...........] - ETA: 1:04 - loss: 0.9196 - regression_loss: 0.8120 - classification_loss: 0.1076 334/500 [===================>..........] - ETA: 1:04 - loss: 0.9204 - regression_loss: 0.8126 - classification_loss: 0.1078 335/500 [===================>..........] - ETA: 1:03 - loss: 0.9201 - regression_loss: 0.8123 - classification_loss: 0.1078 336/500 [===================>..........] - ETA: 1:03 - loss: 0.9198 - regression_loss: 0.8121 - classification_loss: 0.1077 337/500 [===================>..........] - ETA: 1:03 - loss: 0.9208 - regression_loss: 0.8129 - classification_loss: 0.1079 338/500 [===================>..........] - ETA: 1:02 - loss: 0.9210 - regression_loss: 0.8131 - classification_loss: 0.1079 339/500 [===================>..........] - ETA: 1:02 - loss: 0.9228 - regression_loss: 0.8148 - classification_loss: 0.1080 340/500 [===================>..........] - ETA: 1:01 - loss: 0.9226 - regression_loss: 0.8146 - classification_loss: 0.1080 341/500 [===================>..........] - ETA: 1:01 - loss: 0.9224 - regression_loss: 0.8145 - classification_loss: 0.1079 342/500 [===================>..........] - ETA: 1:01 - loss: 0.9219 - regression_loss: 0.8140 - classification_loss: 0.1079 343/500 [===================>..........] - ETA: 1:00 - loss: 0.9212 - regression_loss: 0.8135 - classification_loss: 0.1078 344/500 [===================>..........] - ETA: 1:00 - loss: 0.9208 - regression_loss: 0.8131 - classification_loss: 0.1077 345/500 [===================>..........] 
- ETA: 59s - loss: 0.9196 - regression_loss: 0.8121 - classification_loss: 0.1075  346/500 [===================>..........] - ETA: 59s - loss: 0.9181 - regression_loss: 0.8108 - classification_loss: 0.1074 347/500 [===================>..........] - ETA: 59s - loss: 0.9185 - regression_loss: 0.8111 - classification_loss: 0.1074 348/500 [===================>..........] - ETA: 58s - loss: 0.9180 - regression_loss: 0.8108 - classification_loss: 0.1072 349/500 [===================>..........] - ETA: 58s - loss: 0.9184 - regression_loss: 0.8110 - classification_loss: 0.1074 350/500 [====================>.........] - ETA: 58s - loss: 0.9176 - regression_loss: 0.8104 - classification_loss: 0.1073 351/500 [====================>.........] - ETA: 57s - loss: 0.9164 - regression_loss: 0.8093 - classification_loss: 0.1071 352/500 [====================>.........] - ETA: 57s - loss: 0.9169 - regression_loss: 0.8097 - classification_loss: 0.1072 353/500 [====================>.........] - ETA: 56s - loss: 0.9160 - regression_loss: 0.8090 - classification_loss: 0.1070 354/500 [====================>.........] - ETA: 56s - loss: 0.9152 - regression_loss: 0.8083 - classification_loss: 0.1069 355/500 [====================>.........] - ETA: 56s - loss: 0.9152 - regression_loss: 0.8081 - classification_loss: 0.1071 356/500 [====================>.........] - ETA: 55s - loss: 0.9155 - regression_loss: 0.8084 - classification_loss: 0.1071 357/500 [====================>.........] - ETA: 55s - loss: 0.9158 - regression_loss: 0.8086 - classification_loss: 0.1072 358/500 [====================>.........] - ETA: 54s - loss: 0.9149 - regression_loss: 0.8079 - classification_loss: 0.1070 359/500 [====================>.........] - ETA: 54s - loss: 0.9147 - regression_loss: 0.8078 - classification_loss: 0.1070 360/500 [====================>.........] - ETA: 54s - loss: 0.9153 - regression_loss: 0.8084 - classification_loss: 0.1069 361/500 [====================>.........] 
- ETA: 53s - loss: 0.9147 - regression_loss: 0.8078 - classification_loss: 0.1069 362/500 [====================>.........] - ETA: 53s - loss: 0.9151 - regression_loss: 0.8080 - classification_loss: 0.1071 363/500 [====================>.........] - ETA: 53s - loss: 0.9151 - regression_loss: 0.8080 - classification_loss: 0.1072 364/500 [====================>.........] - ETA: 52s - loss: 0.9151 - regression_loss: 0.8081 - classification_loss: 0.1070 365/500 [====================>.........] - ETA: 52s - loss: 0.9139 - regression_loss: 0.8070 - classification_loss: 0.1069 366/500 [====================>.........] - ETA: 51s - loss: 0.9146 - regression_loss: 0.8076 - classification_loss: 0.1070 367/500 [=====================>........] - ETA: 51s - loss: 0.9156 - regression_loss: 0.8085 - classification_loss: 0.1071 368/500 [=====================>........] - ETA: 51s - loss: 0.9155 - regression_loss: 0.8085 - classification_loss: 0.1070 369/500 [=====================>........] - ETA: 50s - loss: 0.9159 - regression_loss: 0.8089 - classification_loss: 0.1070 370/500 [=====================>........] - ETA: 50s - loss: 0.9151 - regression_loss: 0.8081 - classification_loss: 0.1069 371/500 [=====================>........] - ETA: 49s - loss: 0.9152 - regression_loss: 0.8082 - classification_loss: 0.1070 372/500 [=====================>........] - ETA: 49s - loss: 0.9140 - regression_loss: 0.8069 - classification_loss: 0.1071 373/500 [=====================>........] - ETA: 49s - loss: 0.9128 - regression_loss: 0.8059 - classification_loss: 0.1070 374/500 [=====================>........] - ETA: 48s - loss: 0.9125 - regression_loss: 0.8056 - classification_loss: 0.1068 375/500 [=====================>........] - ETA: 48s - loss: 0.9128 - regression_loss: 0.8060 - classification_loss: 0.1068 376/500 [=====================>........] - ETA: 48s - loss: 0.9127 - regression_loss: 0.8059 - classification_loss: 0.1067 377/500 [=====================>........] 
- ETA: 47s - loss: 0.9127 - regression_loss: 0.8060 - classification_loss: 0.1067 378/500 [=====================>........] - ETA: 47s - loss: 0.9134 - regression_loss: 0.8066 - classification_loss: 0.1068 379/500 [=====================>........] - ETA: 46s - loss: 0.9151 - regression_loss: 0.8079 - classification_loss: 0.1072 380/500 [=====================>........] - ETA: 46s - loss: 0.9141 - regression_loss: 0.8070 - classification_loss: 0.1071 381/500 [=====================>........] - ETA: 46s - loss: 0.9149 - regression_loss: 0.8077 - classification_loss: 0.1072 382/500 [=====================>........] - ETA: 45s - loss: 0.9143 - regression_loss: 0.8071 - classification_loss: 0.1072 383/500 [=====================>........] - ETA: 45s - loss: 0.9139 - regression_loss: 0.8068 - classification_loss: 0.1071 384/500 [======================>.......] - ETA: 44s - loss: 0.9141 - regression_loss: 0.8070 - classification_loss: 0.1070 385/500 [======================>.......] - ETA: 44s - loss: 0.9136 - regression_loss: 0.8067 - classification_loss: 0.1068 386/500 [======================>.......] - ETA: 44s - loss: 0.9139 - regression_loss: 0.8071 - classification_loss: 0.1069 387/500 [======================>.......] - ETA: 43s - loss: 0.9133 - regression_loss: 0.8066 - classification_loss: 0.1067 388/500 [======================>.......] - ETA: 43s - loss: 0.9131 - regression_loss: 0.8065 - classification_loss: 0.1066 389/500 [======================>.......] - ETA: 42s - loss: 0.9129 - regression_loss: 0.8062 - classification_loss: 0.1067 390/500 [======================>.......] - ETA: 42s - loss: 0.9130 - regression_loss: 0.8063 - classification_loss: 0.1067 391/500 [======================>.......] - ETA: 42s - loss: 0.9132 - regression_loss: 0.8064 - classification_loss: 0.1068 392/500 [======================>.......] - ETA: 41s - loss: 0.9128 - regression_loss: 0.8061 - classification_loss: 0.1067 393/500 [======================>.......] 
- ETA: 41s - loss: 0.9122 - regression_loss: 0.8056 - classification_loss: 0.1066 394/500 [======================>.......] - ETA: 41s - loss: 0.9133 - regression_loss: 0.8066 - classification_loss: 0.1066 395/500 [======================>.......] - ETA: 40s - loss: 0.9129 - regression_loss: 0.8063 - classification_loss: 0.1066 396/500 [======================>.......] - ETA: 40s - loss: 0.9126 - regression_loss: 0.8061 - classification_loss: 0.1066 397/500 [======================>.......] - ETA: 39s - loss: 0.9129 - regression_loss: 0.8063 - classification_loss: 0.1066 398/500 [======================>.......] - ETA: 39s - loss: 0.9128 - regression_loss: 0.8061 - classification_loss: 0.1066 399/500 [======================>.......] - ETA: 39s - loss: 0.9120 - regression_loss: 0.8054 - classification_loss: 0.1065 400/500 [=======================>......] - ETA: 38s - loss: 0.9115 - regression_loss: 0.8050 - classification_loss: 0.1065 401/500 [=======================>......] - ETA: 38s - loss: 0.9117 - regression_loss: 0.8052 - classification_loss: 0.1065 402/500 [=======================>......] - ETA: 37s - loss: 0.9118 - regression_loss: 0.8052 - classification_loss: 0.1066 403/500 [=======================>......] - ETA: 37s - loss: 0.9119 - regression_loss: 0.8053 - classification_loss: 0.1066 404/500 [=======================>......] - ETA: 37s - loss: 0.9131 - regression_loss: 0.8065 - classification_loss: 0.1067 405/500 [=======================>......] - ETA: 36s - loss: 0.9121 - regression_loss: 0.8056 - classification_loss: 0.1065 406/500 [=======================>......] - ETA: 36s - loss: 0.9107 - regression_loss: 0.8043 - classification_loss: 0.1064 407/500 [=======================>......] - ETA: 36s - loss: 0.9099 - regression_loss: 0.8037 - classification_loss: 0.1063 408/500 [=======================>......] - ETA: 35s - loss: 0.9098 - regression_loss: 0.8035 - classification_loss: 0.1063 409/500 [=======================>......] 
- ETA: 35s - loss: 0.9087 - regression_loss: 0.8025 - classification_loss: 0.1062 410/500 [=======================>......] - ETA: 34s - loss: 0.9090 - regression_loss: 0.8028 - classification_loss: 0.1062 411/500 [=======================>......] - ETA: 34s - loss: 0.9089 - regression_loss: 0.8028 - classification_loss: 0.1061 412/500 [=======================>......] - ETA: 34s - loss: 0.9086 - regression_loss: 0.8024 - classification_loss: 0.1062 413/500 [=======================>......] - ETA: 33s - loss: 0.9072 - regression_loss: 0.8012 - classification_loss: 0.1060 414/500 [=======================>......] - ETA: 33s - loss: 0.9078 - regression_loss: 0.8017 - classification_loss: 0.1061 415/500 [=======================>......] - ETA: 32s - loss: 0.9081 - regression_loss: 0.8020 - classification_loss: 0.1062 416/500 [=======================>......] - ETA: 32s - loss: 0.9078 - regression_loss: 0.8017 - classification_loss: 0.1061 417/500 [========================>.....] - ETA: 32s - loss: 0.9074 - regression_loss: 0.8014 - classification_loss: 0.1060 418/500 [========================>.....] - ETA: 31s - loss: 0.9066 - regression_loss: 0.8008 - classification_loss: 0.1058 419/500 [========================>.....] - ETA: 31s - loss: 0.9059 - regression_loss: 0.8002 - classification_loss: 0.1057 420/500 [========================>.....] - ETA: 30s - loss: 0.9044 - regression_loss: 0.7989 - classification_loss: 0.1055 421/500 [========================>.....] - ETA: 30s - loss: 0.9044 - regression_loss: 0.7989 - classification_loss: 0.1056 422/500 [========================>.....] - ETA: 30s - loss: 0.9035 - regression_loss: 0.7980 - classification_loss: 0.1055 423/500 [========================>.....] - ETA: 29s - loss: 0.9033 - regression_loss: 0.7979 - classification_loss: 0.1054 424/500 [========================>.....] - ETA: 29s - loss: 0.9042 - regression_loss: 0.7987 - classification_loss: 0.1055 425/500 [========================>.....] 
- ETA: 29s - loss: 0.9048 - regression_loss: 0.7992 - classification_loss: 0.1056 426/500 [========================>.....] - ETA: 28s - loss: 0.9048 - regression_loss: 0.7992 - classification_loss: 0.1056 427/500 [========================>.....] - ETA: 28s - loss: 0.9038 - regression_loss: 0.7983 - classification_loss: 0.1055 428/500 [========================>.....] - ETA: 27s - loss: 0.9040 - regression_loss: 0.7984 - classification_loss: 0.1056 429/500 [========================>.....] - ETA: 27s - loss: 0.9038 - regression_loss: 0.7982 - classification_loss: 0.1056 430/500 [========================>.....] - ETA: 27s - loss: 0.9032 - regression_loss: 0.7977 - classification_loss: 0.1055 431/500 [========================>.....] - ETA: 26s - loss: 0.9026 - regression_loss: 0.7972 - classification_loss: 0.1054 432/500 [========================>.....] - ETA: 26s - loss: 0.9024 - regression_loss: 0.7970 - classification_loss: 0.1055 433/500 [========================>.....] - ETA: 25s - loss: 0.9015 - regression_loss: 0.7962 - classification_loss: 0.1053 434/500 [=========================>....] - ETA: 25s - loss: 0.9011 - regression_loss: 0.7959 - classification_loss: 0.1052 435/500 [=========================>....] - ETA: 25s - loss: 0.9012 - regression_loss: 0.7960 - classification_loss: 0.1052 436/500 [=========================>....] - ETA: 24s - loss: 0.9012 - regression_loss: 0.7960 - classification_loss: 0.1053 437/500 [=========================>....] - ETA: 24s - loss: 0.9013 - regression_loss: 0.7962 - classification_loss: 0.1051 438/500 [=========================>....] - ETA: 24s - loss: 0.9017 - regression_loss: 0.7966 - classification_loss: 0.1052 439/500 [=========================>....] - ETA: 23s - loss: 0.9017 - regression_loss: 0.7966 - classification_loss: 0.1051 440/500 [=========================>....] - ETA: 23s - loss: 0.9027 - regression_loss: 0.7974 - classification_loss: 0.1053 441/500 [=========================>....] 
- ETA: 22s - loss: 0.9023 - regression_loss: 0.7970 - classification_loss: 0.1053 442/500 [=========================>....] - ETA: 22s - loss: 0.9026 - regression_loss: 0.7973 - classification_loss: 0.1053 443/500 [=========================>....] - ETA: 22s - loss: 0.9019 - regression_loss: 0.7966 - classification_loss: 0.1053 444/500 [=========================>....] - ETA: 21s - loss: 0.9012 - regression_loss: 0.7961 - classification_loss: 0.1051 445/500 [=========================>....] - ETA: 21s - loss: 0.9005 - regression_loss: 0.7954 - classification_loss: 0.1050 446/500 [=========================>....] - ETA: 20s - loss: 0.9005 - regression_loss: 0.7954 - classification_loss: 0.1051 447/500 [=========================>....] - ETA: 20s - loss: 0.9008 - regression_loss: 0.7957 - classification_loss: 0.1051 448/500 [=========================>....] - ETA: 20s - loss: 0.9001 - regression_loss: 0.7950 - classification_loss: 0.1050 449/500 [=========================>....] - ETA: 19s - loss: 0.9009 - regression_loss: 0.7957 - classification_loss: 0.1051 450/500 [==========================>...] - ETA: 19s - loss: 0.9004 - regression_loss: 0.7952 - classification_loss: 0.1051 451/500 [==========================>...] - ETA: 18s - loss: 0.9004 - regression_loss: 0.7953 - classification_loss: 0.1051 452/500 [==========================>...] - ETA: 18s - loss: 0.9007 - regression_loss: 0.7956 - classification_loss: 0.1051 453/500 [==========================>...] - ETA: 18s - loss: 0.9004 - regression_loss: 0.7953 - classification_loss: 0.1051 454/500 [==========================>...] - ETA: 17s - loss: 0.8998 - regression_loss: 0.7947 - classification_loss: 0.1051 455/500 [==========================>...] - ETA: 17s - loss: 0.8990 - regression_loss: 0.7940 - classification_loss: 0.1050 456/500 [==========================>...] - ETA: 17s - loss: 0.8982 - regression_loss: 0.7933 - classification_loss: 0.1049 457/500 [==========================>...] 
- ETA: 16s - loss: 0.8975 - regression_loss: 0.7928 - classification_loss: 0.1047 458/500 [==========================>...] - ETA: 16s - loss: 0.8976 - regression_loss: 0.7928 - classification_loss: 0.1048 459/500 [==========================>...] - ETA: 15s - loss: 0.8977 - regression_loss: 0.7930 - classification_loss: 0.1047 460/500 [==========================>...] - ETA: 15s - loss: 0.8974 - regression_loss: 0.7928 - classification_loss: 0.1047 461/500 [==========================>...] - ETA: 15s - loss: 0.8977 - regression_loss: 0.7930 - classification_loss: 0.1047 462/500 [==========================>...] - ETA: 14s - loss: 0.8969 - regression_loss: 0.7923 - classification_loss: 0.1047 463/500 [==========================>...] - ETA: 14s - loss: 0.8973 - regression_loss: 0.7925 - classification_loss: 0.1048 464/500 [==========================>...] - ETA: 13s - loss: 0.8974 - regression_loss: 0.7926 - classification_loss: 0.1047 465/500 [==========================>...] - ETA: 13s - loss: 0.8977 - regression_loss: 0.7930 - classification_loss: 0.1047 466/500 [==========================>...] - ETA: 13s - loss: 0.8977 - regression_loss: 0.7929 - classification_loss: 0.1048 467/500 [===========================>..] - ETA: 12s - loss: 0.8973 - regression_loss: 0.7926 - classification_loss: 0.1047 468/500 [===========================>..] - ETA: 12s - loss: 0.8986 - regression_loss: 0.7937 - classification_loss: 0.1049 469/500 [===========================>..] - ETA: 12s - loss: 0.8990 - regression_loss: 0.7939 - classification_loss: 0.1050 470/500 [===========================>..] - ETA: 11s - loss: 0.8996 - regression_loss: 0.7946 - classification_loss: 0.1050 471/500 [===========================>..] - ETA: 11s - loss: 0.8992 - regression_loss: 0.7943 - classification_loss: 0.1049 472/500 [===========================>..] - ETA: 10s - loss: 0.9002 - regression_loss: 0.7951 - classification_loss: 0.1050 473/500 [===========================>..] 
- ETA: 10s - loss: 0.8996 - regression_loss: 0.7947 - classification_loss: 0.1049 474/500 [===========================>..] - ETA: 10s - loss: 0.8988 - regression_loss: 0.7940 - classification_loss: 0.1048 475/500 [===========================>..] - ETA: 9s - loss: 0.8993 - regression_loss: 0.7944 - classification_loss: 0.1049  476/500 [===========================>..] - ETA: 9s - loss: 0.8993 - regression_loss: 0.7945 - classification_loss: 0.1048 477/500 [===========================>..] - ETA: 8s - loss: 0.8993 - regression_loss: 0.7944 - classification_loss: 0.1049 478/500 [===========================>..] - ETA: 8s - loss: 0.8996 - regression_loss: 0.7947 - classification_loss: 0.1049 479/500 [===========================>..] - ETA: 8s - loss: 0.8987 - regression_loss: 0.7940 - classification_loss: 0.1048 480/500 [===========================>..] - ETA: 7s - loss: 0.8995 - regression_loss: 0.7946 - classification_loss: 0.1049 481/500 [===========================>..] - ETA: 7s - loss: 0.8990 - regression_loss: 0.7942 - classification_loss: 0.1048 482/500 [===========================>..] - ETA: 6s - loss: 0.8987 - regression_loss: 0.7938 - classification_loss: 0.1048 483/500 [===========================>..] - ETA: 6s - loss: 0.8989 - regression_loss: 0.7941 - classification_loss: 0.1049 484/500 [============================>.] - ETA: 6s - loss: 0.8997 - regression_loss: 0.7945 - classification_loss: 0.1051 485/500 [============================>.] - ETA: 5s - loss: 0.8999 - regression_loss: 0.7948 - classification_loss: 0.1050 486/500 [============================>.] - ETA: 5s - loss: 0.8991 - regression_loss: 0.7942 - classification_loss: 0.1049 487/500 [============================>.] - ETA: 5s - loss: 0.8995 - regression_loss: 0.7945 - classification_loss: 0.1050 488/500 [============================>.] - ETA: 4s - loss: 0.9015 - regression_loss: 0.7962 - classification_loss: 0.1053 489/500 [============================>.] 
- ETA: 4s - loss: 0.9018 - regression_loss: 0.7964 - classification_loss: 0.1054 490/500 [============================>.] - ETA: 3s - loss: 0.9009 - regression_loss: 0.7957 - classification_loss: 0.1053 491/500 [============================>.] - ETA: 3s - loss: 0.9006 - regression_loss: 0.7953 - classification_loss: 0.1053 492/500 [============================>.] - ETA: 3s - loss: 0.8998 - regression_loss: 0.7946 - classification_loss: 0.1052 493/500 [============================>.] - ETA: 2s - loss: 0.8998 - regression_loss: 0.7946 - classification_loss: 0.1052 494/500 [============================>.] - ETA: 2s - loss: 0.8992 - regression_loss: 0.7942 - classification_loss: 0.1051 495/500 [============================>.] - ETA: 1s - loss: 0.8993 - regression_loss: 0.7942 - classification_loss: 0.1051 496/500 [============================>.] - ETA: 1s - loss: 0.8982 - regression_loss: 0.7933 - classification_loss: 0.1049 497/500 [============================>.] - ETA: 1s - loss: 0.8978 - regression_loss: 0.7930 - classification_loss: 0.1048 498/500 [============================>.] - ETA: 0s - loss: 0.8980 - regression_loss: 0.7932 - classification_loss: 0.1048 499/500 [============================>.] - ETA: 0s - loss: 0.8979 - regression_loss: 0.7930 - classification_loss: 0.1049 500/500 [==============================] - 194s 388ms/step - loss: 0.8969 - regression_loss: 0.7922 - classification_loss: 0.1047 1172 instances of class plum with average precision: 0.7093 mAP: 0.7093 Epoch 00011: saving model to ./training/snapshots/resnet101_pascal_11.h5 Epoch 12/20 1/500 [..............................] - ETA: 3:12 - loss: 1.2640 - regression_loss: 1.1027 - classification_loss: 0.1613 2/500 [..............................] - ETA: 3:08 - loss: 0.9670 - regression_loss: 0.8439 - classification_loss: 0.1231 3/500 [..............................] - ETA: 3:07 - loss: 0.9873 - regression_loss: 0.8619 - classification_loss: 0.1254 4/500 [..............................] 
- ETA: 3:09 - loss: 1.0653 - regression_loss: 0.9435 - classification_loss: 0.1219
[per-step progress output for steps 5–67 of epoch 12 truncated]
68/500 [===>..........................]
- ETA: 2:46 - loss: 0.8427 - regression_loss: 0.7459 - classification_loss: 0.0968 69/500 [===>..........................] - ETA: 2:46 - loss: 0.8514 - regression_loss: 0.7542 - classification_loss: 0.0972 70/500 [===>..........................] - ETA: 2:45 - loss: 0.8635 - regression_loss: 0.7655 - classification_loss: 0.0980 71/500 [===>..........................] - ETA: 2:45 - loss: 0.8752 - regression_loss: 0.7759 - classification_loss: 0.0992 72/500 [===>..........................] - ETA: 2:45 - loss: 0.8817 - regression_loss: 0.7820 - classification_loss: 0.0997 73/500 [===>..........................] - ETA: 2:44 - loss: 0.8879 - regression_loss: 0.7875 - classification_loss: 0.1004 74/500 [===>..........................] - ETA: 2:44 - loss: 0.8925 - regression_loss: 0.7918 - classification_loss: 0.1007 75/500 [===>..........................] - ETA: 2:44 - loss: 0.8893 - regression_loss: 0.7892 - classification_loss: 0.1001 76/500 [===>..........................] - ETA: 2:43 - loss: 0.8955 - regression_loss: 0.7940 - classification_loss: 0.1015 77/500 [===>..........................] - ETA: 2:43 - loss: 0.8952 - regression_loss: 0.7936 - classification_loss: 0.1015 78/500 [===>..........................] - ETA: 2:42 - loss: 0.9044 - regression_loss: 0.8009 - classification_loss: 0.1035 79/500 [===>..........................] - ETA: 2:42 - loss: 0.8962 - regression_loss: 0.7937 - classification_loss: 0.1025 80/500 [===>..........................] - ETA: 2:41 - loss: 0.8953 - regression_loss: 0.7932 - classification_loss: 0.1021 81/500 [===>..........................] - ETA: 2:41 - loss: 0.9045 - regression_loss: 0.8006 - classification_loss: 0.1039 82/500 [===>..........................] - ETA: 2:41 - loss: 0.8993 - regression_loss: 0.7962 - classification_loss: 0.1031 83/500 [===>..........................] - ETA: 2:40 - loss: 0.8961 - regression_loss: 0.7937 - classification_loss: 0.1024 84/500 [====>.........................] 
- ETA: 2:40 - loss: 0.9043 - regression_loss: 0.8008 - classification_loss: 0.1035 85/500 [====>.........................] - ETA: 2:39 - loss: 0.9013 - regression_loss: 0.7983 - classification_loss: 0.1030 86/500 [====>.........................] - ETA: 2:39 - loss: 0.9017 - regression_loss: 0.7988 - classification_loss: 0.1029 87/500 [====>.........................] - ETA: 2:38 - loss: 0.8971 - regression_loss: 0.7950 - classification_loss: 0.1021 88/500 [====>.........................] - ETA: 2:38 - loss: 0.8972 - regression_loss: 0.7952 - classification_loss: 0.1021 89/500 [====>.........................] - ETA: 2:38 - loss: 0.9010 - regression_loss: 0.7981 - classification_loss: 0.1029 90/500 [====>.........................] - ETA: 2:38 - loss: 0.8950 - regression_loss: 0.7929 - classification_loss: 0.1022 91/500 [====>.........................] - ETA: 2:37 - loss: 0.8924 - regression_loss: 0.7906 - classification_loss: 0.1018 92/500 [====>.........................] - ETA: 2:37 - loss: 0.8941 - regression_loss: 0.7923 - classification_loss: 0.1018 93/500 [====>.........................] - ETA: 2:36 - loss: 0.8957 - regression_loss: 0.7939 - classification_loss: 0.1018 94/500 [====>.........................] - ETA: 2:36 - loss: 0.8922 - regression_loss: 0.7913 - classification_loss: 0.1009 95/500 [====>.........................] - ETA: 2:35 - loss: 0.8888 - regression_loss: 0.7885 - classification_loss: 0.1004 96/500 [====>.........................] - ETA: 2:35 - loss: 0.8928 - regression_loss: 0.7916 - classification_loss: 0.1011 97/500 [====>.........................] - ETA: 2:35 - loss: 0.8907 - regression_loss: 0.7897 - classification_loss: 0.1009 98/500 [====>.........................] - ETA: 2:34 - loss: 0.8870 - regression_loss: 0.7862 - classification_loss: 0.1008 99/500 [====>.........................] - ETA: 2:34 - loss: 0.8903 - regression_loss: 0.7892 - classification_loss: 0.1011 100/500 [=====>........................] 
- ETA: 2:33 - loss: 0.8939 - regression_loss: 0.7926 - classification_loss: 0.1013 101/500 [=====>........................] - ETA: 2:33 - loss: 0.8911 - regression_loss: 0.7902 - classification_loss: 0.1009 102/500 [=====>........................] - ETA: 2:33 - loss: 0.8893 - regression_loss: 0.7889 - classification_loss: 0.1004 103/500 [=====>........................] - ETA: 2:32 - loss: 0.8919 - regression_loss: 0.7912 - classification_loss: 0.1007 104/500 [=====>........................] - ETA: 2:32 - loss: 0.8883 - regression_loss: 0.7881 - classification_loss: 0.1001 105/500 [=====>........................] - ETA: 2:32 - loss: 0.8849 - regression_loss: 0.7851 - classification_loss: 0.0998 106/500 [=====>........................] - ETA: 2:31 - loss: 0.8865 - regression_loss: 0.7865 - classification_loss: 0.1000 107/500 [=====>........................] - ETA: 2:31 - loss: 0.8823 - regression_loss: 0.7828 - classification_loss: 0.0996 108/500 [=====>........................] - ETA: 2:30 - loss: 0.8809 - regression_loss: 0.7815 - classification_loss: 0.0994 109/500 [=====>........................] - ETA: 2:30 - loss: 0.8809 - regression_loss: 0.7816 - classification_loss: 0.0993 110/500 [=====>........................] - ETA: 2:30 - loss: 0.8782 - regression_loss: 0.7792 - classification_loss: 0.0990 111/500 [=====>........................] - ETA: 2:29 - loss: 0.8784 - regression_loss: 0.7796 - classification_loss: 0.0988 112/500 [=====>........................] - ETA: 2:29 - loss: 0.8778 - regression_loss: 0.7791 - classification_loss: 0.0987 113/500 [=====>........................] - ETA: 2:29 - loss: 0.8754 - regression_loss: 0.7769 - classification_loss: 0.0985 114/500 [=====>........................] - ETA: 2:28 - loss: 0.8768 - regression_loss: 0.7783 - classification_loss: 0.0984 115/500 [=====>........................] - ETA: 2:28 - loss: 0.8755 - regression_loss: 0.7772 - classification_loss: 0.0983 116/500 [=====>........................] 
- ETA: 2:28 - loss: 0.8754 - regression_loss: 0.7770 - classification_loss: 0.0984 117/500 [======>.......................] - ETA: 2:27 - loss: 0.8787 - regression_loss: 0.7801 - classification_loss: 0.0986 118/500 [======>.......................] - ETA: 2:27 - loss: 0.8746 - regression_loss: 0.7765 - classification_loss: 0.0981 119/500 [======>.......................] - ETA: 2:27 - loss: 0.8726 - regression_loss: 0.7747 - classification_loss: 0.0980 120/500 [======>.......................] - ETA: 2:26 - loss: 0.8722 - regression_loss: 0.7744 - classification_loss: 0.0978 121/500 [======>.......................] - ETA: 2:26 - loss: 0.8728 - regression_loss: 0.7749 - classification_loss: 0.0979 122/500 [======>.......................] - ETA: 2:25 - loss: 0.8703 - regression_loss: 0.7726 - classification_loss: 0.0976 123/500 [======>.......................] - ETA: 2:25 - loss: 0.8712 - regression_loss: 0.7735 - classification_loss: 0.0977 124/500 [======>.......................] - ETA: 2:24 - loss: 0.8706 - regression_loss: 0.7732 - classification_loss: 0.0975 125/500 [======>.......................] - ETA: 2:24 - loss: 0.8740 - regression_loss: 0.7760 - classification_loss: 0.0980 126/500 [======>.......................] - ETA: 2:24 - loss: 0.8720 - regression_loss: 0.7743 - classification_loss: 0.0977 127/500 [======>.......................] - ETA: 2:23 - loss: 0.8703 - regression_loss: 0.7727 - classification_loss: 0.0976 128/500 [======>.......................] - ETA: 2:23 - loss: 0.8681 - regression_loss: 0.7709 - classification_loss: 0.0973 129/500 [======>.......................] - ETA: 2:22 - loss: 0.8672 - regression_loss: 0.7702 - classification_loss: 0.0971 130/500 [======>.......................] - ETA: 2:22 - loss: 0.8651 - regression_loss: 0.7684 - classification_loss: 0.0967 131/500 [======>.......................] - ETA: 2:22 - loss: 0.8628 - regression_loss: 0.7665 - classification_loss: 0.0963 132/500 [======>.......................] 
- ETA: 2:21 - loss: 0.8601 - regression_loss: 0.7642 - classification_loss: 0.0959 133/500 [======>.......................] - ETA: 2:21 - loss: 0.8574 - regression_loss: 0.7618 - classification_loss: 0.0957 134/500 [=======>......................] - ETA: 2:20 - loss: 0.8599 - regression_loss: 0.7642 - classification_loss: 0.0957 135/500 [=======>......................] - ETA: 2:20 - loss: 0.8600 - regression_loss: 0.7643 - classification_loss: 0.0957 136/500 [=======>......................] - ETA: 2:20 - loss: 0.8615 - regression_loss: 0.7657 - classification_loss: 0.0958 137/500 [=======>......................] - ETA: 2:19 - loss: 0.8599 - regression_loss: 0.7640 - classification_loss: 0.0958 138/500 [=======>......................] - ETA: 2:19 - loss: 0.8598 - regression_loss: 0.7640 - classification_loss: 0.0958 139/500 [=======>......................] - ETA: 2:18 - loss: 0.8583 - regression_loss: 0.7627 - classification_loss: 0.0956 140/500 [=======>......................] - ETA: 2:18 - loss: 0.8557 - regression_loss: 0.7605 - classification_loss: 0.0952 141/500 [=======>......................] - ETA: 2:18 - loss: 0.8535 - regression_loss: 0.7584 - classification_loss: 0.0951 142/500 [=======>......................] - ETA: 2:17 - loss: 0.8502 - regression_loss: 0.7556 - classification_loss: 0.0946 143/500 [=======>......................] - ETA: 2:17 - loss: 0.8521 - regression_loss: 0.7571 - classification_loss: 0.0950 144/500 [=======>......................] - ETA: 2:17 - loss: 0.8515 - regression_loss: 0.7568 - classification_loss: 0.0947 145/500 [=======>......................] - ETA: 2:16 - loss: 0.8489 - regression_loss: 0.7546 - classification_loss: 0.0943 146/500 [=======>......................] - ETA: 2:16 - loss: 0.8503 - regression_loss: 0.7558 - classification_loss: 0.0945 147/500 [=======>......................] - ETA: 2:16 - loss: 0.8479 - regression_loss: 0.7538 - classification_loss: 0.0942 148/500 [=======>......................] 
- ETA: 2:15 - loss: 0.8481 - regression_loss: 0.7540 - classification_loss: 0.0941 149/500 [=======>......................] - ETA: 2:15 - loss: 0.8480 - regression_loss: 0.7539 - classification_loss: 0.0941 150/500 [========>.....................] - ETA: 2:15 - loss: 0.8478 - regression_loss: 0.7539 - classification_loss: 0.0939 151/500 [========>.....................] - ETA: 2:14 - loss: 0.8476 - regression_loss: 0.7535 - classification_loss: 0.0941 152/500 [========>.....................] - ETA: 2:14 - loss: 0.8475 - regression_loss: 0.7534 - classification_loss: 0.0942 153/500 [========>.....................] - ETA: 2:13 - loss: 0.8503 - regression_loss: 0.7560 - classification_loss: 0.0943 154/500 [========>.....................] - ETA: 2:13 - loss: 0.8524 - regression_loss: 0.7580 - classification_loss: 0.0945 155/500 [========>.....................] - ETA: 2:13 - loss: 0.8528 - regression_loss: 0.7583 - classification_loss: 0.0944 156/500 [========>.....................] - ETA: 2:12 - loss: 0.8538 - regression_loss: 0.7592 - classification_loss: 0.0947 157/500 [========>.....................] - ETA: 2:12 - loss: 0.8558 - regression_loss: 0.7608 - classification_loss: 0.0950 158/500 [========>.....................] - ETA: 2:11 - loss: 0.8570 - regression_loss: 0.7618 - classification_loss: 0.0951 159/500 [========>.....................] - ETA: 2:11 - loss: 0.8586 - regression_loss: 0.7634 - classification_loss: 0.0952 160/500 [========>.....................] - ETA: 2:11 - loss: 0.8596 - regression_loss: 0.7643 - classification_loss: 0.0954 161/500 [========>.....................] - ETA: 2:10 - loss: 0.8622 - regression_loss: 0.7664 - classification_loss: 0.0959 162/500 [========>.....................] - ETA: 2:10 - loss: 0.8618 - regression_loss: 0.7661 - classification_loss: 0.0957 163/500 [========>.....................] - ETA: 2:10 - loss: 0.8617 - regression_loss: 0.7659 - classification_loss: 0.0958 164/500 [========>.....................] 
- ETA: 2:09 - loss: 0.8627 - regression_loss: 0.7666 - classification_loss: 0.0960 165/500 [========>.....................] - ETA: 2:09 - loss: 0.8619 - regression_loss: 0.7660 - classification_loss: 0.0959 166/500 [========>.....................] - ETA: 2:08 - loss: 0.8622 - regression_loss: 0.7662 - classification_loss: 0.0960 167/500 [=========>....................] - ETA: 2:08 - loss: 0.8625 - regression_loss: 0.7663 - classification_loss: 0.0962 168/500 [=========>....................] - ETA: 2:08 - loss: 0.8595 - regression_loss: 0.7637 - classification_loss: 0.0958 169/500 [=========>....................] - ETA: 2:07 - loss: 0.8607 - regression_loss: 0.7648 - classification_loss: 0.0960 170/500 [=========>....................] - ETA: 2:07 - loss: 0.8596 - regression_loss: 0.7638 - classification_loss: 0.0958 171/500 [=========>....................] - ETA: 2:06 - loss: 0.8583 - regression_loss: 0.7628 - classification_loss: 0.0955 172/500 [=========>....................] - ETA: 2:06 - loss: 0.8598 - regression_loss: 0.7643 - classification_loss: 0.0956 173/500 [=========>....................] - ETA: 2:06 - loss: 0.8611 - regression_loss: 0.7653 - classification_loss: 0.0958 174/500 [=========>....................] - ETA: 2:05 - loss: 0.8601 - regression_loss: 0.7643 - classification_loss: 0.0957 175/500 [=========>....................] - ETA: 2:05 - loss: 0.8590 - regression_loss: 0.7635 - classification_loss: 0.0954 176/500 [=========>....................] - ETA: 2:05 - loss: 0.8565 - regression_loss: 0.7613 - classification_loss: 0.0953 177/500 [=========>....................] - ETA: 2:04 - loss: 0.8571 - regression_loss: 0.7617 - classification_loss: 0.0954 178/500 [=========>....................] - ETA: 2:04 - loss: 0.8585 - regression_loss: 0.7628 - classification_loss: 0.0957 179/500 [=========>....................] - ETA: 2:03 - loss: 0.8587 - regression_loss: 0.7630 - classification_loss: 0.0957 180/500 [=========>....................] 
- ETA: 2:03 - loss: 0.8583 - regression_loss: 0.7626 - classification_loss: 0.0957 181/500 [=========>....................] - ETA: 2:03 - loss: 0.8581 - regression_loss: 0.7623 - classification_loss: 0.0958 182/500 [=========>....................] - ETA: 2:02 - loss: 0.8561 - regression_loss: 0.7603 - classification_loss: 0.0958 183/500 [=========>....................] - ETA: 2:02 - loss: 0.8565 - regression_loss: 0.7606 - classification_loss: 0.0959 184/500 [==========>...................] - ETA: 2:01 - loss: 0.8574 - regression_loss: 0.7612 - classification_loss: 0.0963 185/500 [==========>...................] - ETA: 2:01 - loss: 0.8612 - regression_loss: 0.7642 - classification_loss: 0.0970 186/500 [==========>...................] - ETA: 2:01 - loss: 0.8584 - regression_loss: 0.7618 - classification_loss: 0.0966 187/500 [==========>...................] - ETA: 2:00 - loss: 0.8579 - regression_loss: 0.7614 - classification_loss: 0.0964 188/500 [==========>...................] - ETA: 2:00 - loss: 0.8570 - regression_loss: 0.7608 - classification_loss: 0.0962 189/500 [==========>...................] - ETA: 2:00 - loss: 0.8574 - regression_loss: 0.7611 - classification_loss: 0.0962 190/500 [==========>...................] - ETA: 1:59 - loss: 0.8577 - regression_loss: 0.7614 - classification_loss: 0.0962 191/500 [==========>...................] - ETA: 1:59 - loss: 0.8579 - regression_loss: 0.7616 - classification_loss: 0.0964 192/500 [==========>...................] - ETA: 1:58 - loss: 0.8579 - regression_loss: 0.7614 - classification_loss: 0.0965 193/500 [==========>...................] - ETA: 1:58 - loss: 0.8584 - regression_loss: 0.7619 - classification_loss: 0.0965 194/500 [==========>...................] - ETA: 1:58 - loss: 0.8579 - regression_loss: 0.7614 - classification_loss: 0.0965 195/500 [==========>...................] - ETA: 1:57 - loss: 0.8586 - regression_loss: 0.7619 - classification_loss: 0.0966 196/500 [==========>...................] 
- ETA: 1:57 - loss: 0.8602 - regression_loss: 0.7632 - classification_loss: 0.0970 197/500 [==========>...................] - ETA: 1:56 - loss: 0.8608 - regression_loss: 0.7636 - classification_loss: 0.0972 198/500 [==========>...................] - ETA: 1:56 - loss: 0.8632 - regression_loss: 0.7655 - classification_loss: 0.0977 199/500 [==========>...................] - ETA: 1:56 - loss: 0.8624 - regression_loss: 0.7647 - classification_loss: 0.0977 200/500 [===========>..................] - ETA: 1:55 - loss: 0.8611 - regression_loss: 0.7636 - classification_loss: 0.0974 201/500 [===========>..................] - ETA: 1:55 - loss: 0.8593 - regression_loss: 0.7621 - classification_loss: 0.0973 202/500 [===========>..................] - ETA: 1:55 - loss: 0.8581 - regression_loss: 0.7609 - classification_loss: 0.0972 203/500 [===========>..................] - ETA: 1:54 - loss: 0.8583 - regression_loss: 0.7611 - classification_loss: 0.0972 204/500 [===========>..................] - ETA: 1:54 - loss: 0.8589 - regression_loss: 0.7617 - classification_loss: 0.0972 205/500 [===========>..................] - ETA: 1:53 - loss: 0.8574 - regression_loss: 0.7604 - classification_loss: 0.0971 206/500 [===========>..................] - ETA: 1:53 - loss: 0.8579 - regression_loss: 0.7607 - classification_loss: 0.0971 207/500 [===========>..................] - ETA: 1:53 - loss: 0.8560 - regression_loss: 0.7590 - classification_loss: 0.0970 208/500 [===========>..................] - ETA: 1:52 - loss: 0.8569 - regression_loss: 0.7596 - classification_loss: 0.0973 209/500 [===========>..................] - ETA: 1:52 - loss: 0.8564 - regression_loss: 0.7593 - classification_loss: 0.0971 210/500 [===========>..................] - ETA: 1:51 - loss: 0.8542 - regression_loss: 0.7574 - classification_loss: 0.0968 211/500 [===========>..................] - ETA: 1:51 - loss: 0.8533 - regression_loss: 0.7566 - classification_loss: 0.0968 212/500 [===========>..................] 
- ETA: 1:51 - loss: 0.8529 - regression_loss: 0.7563 - classification_loss: 0.0966 213/500 [===========>..................] - ETA: 1:50 - loss: 0.8523 - regression_loss: 0.7556 - classification_loss: 0.0967 214/500 [===========>..................] - ETA: 1:50 - loss: 0.8505 - regression_loss: 0.7539 - classification_loss: 0.0966 215/500 [===========>..................] - ETA: 1:50 - loss: 0.8490 - regression_loss: 0.7525 - classification_loss: 0.0964 216/500 [===========>..................] - ETA: 1:49 - loss: 0.8498 - regression_loss: 0.7532 - classification_loss: 0.0966 217/500 [============>.................] - ETA: 1:49 - loss: 0.8484 - regression_loss: 0.7519 - classification_loss: 0.0965 218/500 [============>.................] - ETA: 1:48 - loss: 0.8458 - regression_loss: 0.7496 - classification_loss: 0.0962 219/500 [============>.................] - ETA: 1:48 - loss: 0.8456 - regression_loss: 0.7496 - classification_loss: 0.0959 220/500 [============>.................] - ETA: 1:48 - loss: 0.8447 - regression_loss: 0.7490 - classification_loss: 0.0957 221/500 [============>.................] - ETA: 1:47 - loss: 0.8441 - regression_loss: 0.7485 - classification_loss: 0.0956 222/500 [============>.................] - ETA: 1:47 - loss: 0.8421 - regression_loss: 0.7468 - classification_loss: 0.0954 223/500 [============>.................] - ETA: 1:46 - loss: 0.8428 - regression_loss: 0.7472 - classification_loss: 0.0956 224/500 [============>.................] - ETA: 1:46 - loss: 0.8422 - regression_loss: 0.7467 - classification_loss: 0.0955 225/500 [============>.................] - ETA: 1:46 - loss: 0.8419 - regression_loss: 0.7465 - classification_loss: 0.0955 226/500 [============>.................] - ETA: 1:45 - loss: 0.8394 - regression_loss: 0.7442 - classification_loss: 0.0952 227/500 [============>.................] - ETA: 1:45 - loss: 0.8415 - regression_loss: 0.7459 - classification_loss: 0.0955 228/500 [============>.................] 
- ETA: 1:45 - loss: 0.8437 - regression_loss: 0.7479 - classification_loss: 0.0959 229/500 [============>.................] - ETA: 1:44 - loss: 0.8424 - regression_loss: 0.7467 - classification_loss: 0.0957 230/500 [============>.................] - ETA: 1:44 - loss: 0.8421 - regression_loss: 0.7465 - classification_loss: 0.0956 231/500 [============>.................] - ETA: 1:43 - loss: 0.8428 - regression_loss: 0.7472 - classification_loss: 0.0956 232/500 [============>.................] - ETA: 1:43 - loss: 0.8438 - regression_loss: 0.7482 - classification_loss: 0.0956 233/500 [============>.................] - ETA: 1:43 - loss: 0.8439 - regression_loss: 0.7482 - classification_loss: 0.0958 234/500 [=============>................] - ETA: 1:42 - loss: 0.8455 - regression_loss: 0.7496 - classification_loss: 0.0959 235/500 [=============>................] - ETA: 1:42 - loss: 0.8448 - regression_loss: 0.7491 - classification_loss: 0.0957 236/500 [=============>................] - ETA: 1:41 - loss: 0.8449 - regression_loss: 0.7492 - classification_loss: 0.0957 237/500 [=============>................] - ETA: 1:41 - loss: 0.8462 - regression_loss: 0.7502 - classification_loss: 0.0960 238/500 [=============>................] - ETA: 1:41 - loss: 0.8457 - regression_loss: 0.7498 - classification_loss: 0.0959 239/500 [=============>................] - ETA: 1:40 - loss: 0.8471 - regression_loss: 0.7509 - classification_loss: 0.0962 240/500 [=============>................] - ETA: 1:40 - loss: 0.8483 - regression_loss: 0.7519 - classification_loss: 0.0964 241/500 [=============>................] - ETA: 1:40 - loss: 0.8476 - regression_loss: 0.7514 - classification_loss: 0.0962 242/500 [=============>................] - ETA: 1:39 - loss: 0.8473 - regression_loss: 0.7512 - classification_loss: 0.0961 243/500 [=============>................] - ETA: 1:39 - loss: 0.8483 - regression_loss: 0.7521 - classification_loss: 0.0962 244/500 [=============>................] 
- ETA: 1:39 - loss: 0.8481 - regression_loss: 0.7518 - classification_loss: 0.0963 245/500 [=============>................] - ETA: 1:38 - loss: 0.8491 - regression_loss: 0.7528 - classification_loss: 0.0963 246/500 [=============>................] - ETA: 1:38 - loss: 0.8507 - regression_loss: 0.7542 - classification_loss: 0.0964 247/500 [=============>................] - ETA: 1:37 - loss: 0.8509 - regression_loss: 0.7545 - classification_loss: 0.0964 248/500 [=============>................] - ETA: 1:37 - loss: 0.8522 - regression_loss: 0.7558 - classification_loss: 0.0965 249/500 [=============>................] - ETA: 1:37 - loss: 0.8543 - regression_loss: 0.7574 - classification_loss: 0.0969 250/500 [==============>...............] - ETA: 1:36 - loss: 0.8530 - regression_loss: 0.7563 - classification_loss: 0.0967 251/500 [==============>...............] - ETA: 1:36 - loss: 0.8521 - regression_loss: 0.7554 - classification_loss: 0.0968 252/500 [==============>...............] - ETA: 1:35 - loss: 0.8528 - regression_loss: 0.7559 - classification_loss: 0.0969 253/500 [==============>...............] - ETA: 1:35 - loss: 0.8543 - regression_loss: 0.7573 - classification_loss: 0.0970 254/500 [==============>...............] - ETA: 1:35 - loss: 0.8534 - regression_loss: 0.7566 - classification_loss: 0.0969 255/500 [==============>...............] - ETA: 1:34 - loss: 0.8557 - regression_loss: 0.7584 - classification_loss: 0.0973 256/500 [==============>...............] - ETA: 1:34 - loss: 0.8559 - regression_loss: 0.7587 - classification_loss: 0.0973 257/500 [==============>...............] - ETA: 1:33 - loss: 0.8567 - regression_loss: 0.7593 - classification_loss: 0.0975 258/500 [==============>...............] - ETA: 1:33 - loss: 0.8566 - regression_loss: 0.7591 - classification_loss: 0.0975 259/500 [==============>...............] - ETA: 1:33 - loss: 0.8575 - regression_loss: 0.7598 - classification_loss: 0.0976 260/500 [==============>...............] 
- ETA: 1:32 - loss: 0.8581 - regression_loss: 0.7605 - classification_loss: 0.0976 261/500 [==============>...............] - ETA: 1:32 - loss: 0.8594 - regression_loss: 0.7616 - classification_loss: 0.0978 262/500 [==============>...............] - ETA: 1:32 - loss: 0.8590 - regression_loss: 0.7613 - classification_loss: 0.0977 263/500 [==============>...............] - ETA: 1:31 - loss: 0.8585 - regression_loss: 0.7610 - classification_loss: 0.0975 264/500 [==============>...............] - ETA: 1:31 - loss: 0.8575 - regression_loss: 0.7601 - classification_loss: 0.0974 265/500 [==============>...............] - ETA: 1:30 - loss: 0.8575 - regression_loss: 0.7601 - classification_loss: 0.0974 266/500 [==============>...............] - ETA: 1:30 - loss: 0.8562 - regression_loss: 0.7590 - classification_loss: 0.0972 267/500 [===============>..............] - ETA: 1:30 - loss: 0.8549 - regression_loss: 0.7579 - classification_loss: 0.0970 268/500 [===============>..............] - ETA: 1:29 - loss: 0.8534 - regression_loss: 0.7565 - classification_loss: 0.0969 269/500 [===============>..............] - ETA: 1:29 - loss: 0.8527 - regression_loss: 0.7558 - classification_loss: 0.0969 270/500 [===============>..............] - ETA: 1:28 - loss: 0.8516 - regression_loss: 0.7549 - classification_loss: 0.0966 271/500 [===============>..............] - ETA: 1:28 - loss: 0.8498 - regression_loss: 0.7534 - classification_loss: 0.0964 272/500 [===============>..............] - ETA: 1:28 - loss: 0.8521 - regression_loss: 0.7555 - classification_loss: 0.0967 273/500 [===============>..............] - ETA: 1:27 - loss: 0.8508 - regression_loss: 0.7542 - classification_loss: 0.0965 274/500 [===============>..............] - ETA: 1:27 - loss: 0.8516 - regression_loss: 0.7551 - classification_loss: 0.0965 275/500 [===============>..............] - ETA: 1:27 - loss: 0.8520 - regression_loss: 0.7556 - classification_loss: 0.0964 276/500 [===============>..............] 
[per-batch progress output elided: batches 277-499 of 500, epoch 12/20; running loss stayed in the 0.84-0.85 range (regression_loss ~0.74-0.76, classification_loss ~0.095)]
500/500 [==============================] - 194s 388ms/step - loss: 0.8350 - regression_loss: 0.7400 - classification_loss: 0.0950
1172 instances of class plum with average precision: 0.6942
mAP: 0.6942
Epoch 00012: saving model to ./training/snapshots/resnet101_pascal_12.h5
Epoch 13/20
[per-batch progress output elided: batches 1-13 of 500, epoch 13/20; running loss fluctuated in the 0.74-0.84 range]
[per-batch progress output elided: batches 14-110 of 500, epoch 13/20; running loss settled in the 0.80-0.84 range (regression_loss ~0.70-0.75, classification_loss ~0.090-0.102)]
- ETA: 2:31 - loss: 0.8151 - regression_loss: 0.7236 - classification_loss: 0.0915 111/500 [=====>........................] - ETA: 2:31 - loss: 0.8168 - regression_loss: 0.7252 - classification_loss: 0.0916 112/500 [=====>........................] - ETA: 2:30 - loss: 0.8195 - regression_loss: 0.7277 - classification_loss: 0.0917 113/500 [=====>........................] - ETA: 2:30 - loss: 0.8186 - regression_loss: 0.7272 - classification_loss: 0.0914 114/500 [=====>........................] - ETA: 2:30 - loss: 0.8225 - regression_loss: 0.7308 - classification_loss: 0.0917 115/500 [=====>........................] - ETA: 2:29 - loss: 0.8192 - regression_loss: 0.7279 - classification_loss: 0.0912 116/500 [=====>........................] - ETA: 2:29 - loss: 0.8166 - regression_loss: 0.7256 - classification_loss: 0.0910 117/500 [======>.......................] - ETA: 2:28 - loss: 0.8126 - regression_loss: 0.7222 - classification_loss: 0.0905 118/500 [======>.......................] - ETA: 2:28 - loss: 0.8104 - regression_loss: 0.7201 - classification_loss: 0.0903 119/500 [======>.......................] - ETA: 2:28 - loss: 0.8056 - regression_loss: 0.7158 - classification_loss: 0.0898 120/500 [======>.......................] - ETA: 2:27 - loss: 0.8081 - regression_loss: 0.7174 - classification_loss: 0.0907 121/500 [======>.......................] - ETA: 2:27 - loss: 0.8094 - regression_loss: 0.7187 - classification_loss: 0.0908 122/500 [======>.......................] - ETA: 2:26 - loss: 0.8111 - regression_loss: 0.7201 - classification_loss: 0.0910 123/500 [======>.......................] - ETA: 2:26 - loss: 0.8118 - regression_loss: 0.7207 - classification_loss: 0.0911 124/500 [======>.......................] - ETA: 2:26 - loss: 0.8088 - regression_loss: 0.7181 - classification_loss: 0.0907 125/500 [======>.......................] - ETA: 2:25 - loss: 0.8084 - regression_loss: 0.7177 - classification_loss: 0.0907 126/500 [======>.......................] 
- ETA: 2:25 - loss: 0.8098 - regression_loss: 0.7188 - classification_loss: 0.0910 127/500 [======>.......................] - ETA: 2:24 - loss: 0.8135 - regression_loss: 0.7218 - classification_loss: 0.0917 128/500 [======>.......................] - ETA: 2:24 - loss: 0.8109 - regression_loss: 0.7196 - classification_loss: 0.0913 129/500 [======>.......................] - ETA: 2:24 - loss: 0.8112 - regression_loss: 0.7194 - classification_loss: 0.0919 130/500 [======>.......................] - ETA: 2:23 - loss: 0.8089 - regression_loss: 0.7173 - classification_loss: 0.0916 131/500 [======>.......................] - ETA: 2:23 - loss: 0.8108 - regression_loss: 0.7191 - classification_loss: 0.0916 132/500 [======>.......................] - ETA: 2:23 - loss: 0.8090 - regression_loss: 0.7176 - classification_loss: 0.0914 133/500 [======>.......................] - ETA: 2:22 - loss: 0.8123 - regression_loss: 0.7204 - classification_loss: 0.0919 134/500 [=======>......................] - ETA: 2:22 - loss: 0.8111 - regression_loss: 0.7196 - classification_loss: 0.0916 135/500 [=======>......................] - ETA: 2:21 - loss: 0.8128 - regression_loss: 0.7211 - classification_loss: 0.0917 136/500 [=======>......................] - ETA: 2:21 - loss: 0.8134 - regression_loss: 0.7217 - classification_loss: 0.0917 137/500 [=======>......................] - ETA: 2:21 - loss: 0.8130 - regression_loss: 0.7215 - classification_loss: 0.0915 138/500 [=======>......................] - ETA: 2:20 - loss: 0.8131 - regression_loss: 0.7214 - classification_loss: 0.0916 139/500 [=======>......................] - ETA: 2:20 - loss: 0.8103 - regression_loss: 0.7190 - classification_loss: 0.0913 140/500 [=======>......................] - ETA: 2:19 - loss: 0.8109 - regression_loss: 0.7196 - classification_loss: 0.0913 141/500 [=======>......................] - ETA: 2:19 - loss: 0.8129 - regression_loss: 0.7215 - classification_loss: 0.0914 142/500 [=======>......................] 
- ETA: 2:18 - loss: 0.8124 - regression_loss: 0.7209 - classification_loss: 0.0914 143/500 [=======>......................] - ETA: 2:18 - loss: 0.8142 - regression_loss: 0.7227 - classification_loss: 0.0915 144/500 [=======>......................] - ETA: 2:17 - loss: 0.8161 - regression_loss: 0.7243 - classification_loss: 0.0918 145/500 [=======>......................] - ETA: 2:17 - loss: 0.8167 - regression_loss: 0.7248 - classification_loss: 0.0919 146/500 [=======>......................] - ETA: 2:17 - loss: 0.8139 - regression_loss: 0.7223 - classification_loss: 0.0917 147/500 [=======>......................] - ETA: 2:16 - loss: 0.8144 - regression_loss: 0.7228 - classification_loss: 0.0916 148/500 [=======>......................] - ETA: 2:16 - loss: 0.8142 - regression_loss: 0.7227 - classification_loss: 0.0915 149/500 [=======>......................] - ETA: 2:16 - loss: 0.8192 - regression_loss: 0.7272 - classification_loss: 0.0920 150/500 [========>.....................] - ETA: 2:15 - loss: 0.8191 - regression_loss: 0.7273 - classification_loss: 0.0918 151/500 [========>.....................] - ETA: 2:15 - loss: 0.8160 - regression_loss: 0.7246 - classification_loss: 0.0914 152/500 [========>.....................] - ETA: 2:14 - loss: 0.8170 - regression_loss: 0.7255 - classification_loss: 0.0915 153/500 [========>.....................] - ETA: 2:14 - loss: 0.8160 - regression_loss: 0.7247 - classification_loss: 0.0913 154/500 [========>.....................] - ETA: 2:14 - loss: 0.8134 - regression_loss: 0.7225 - classification_loss: 0.0909 155/500 [========>.....................] - ETA: 2:13 - loss: 0.8123 - regression_loss: 0.7217 - classification_loss: 0.0906 156/500 [========>.....................] - ETA: 2:13 - loss: 0.8131 - regression_loss: 0.7224 - classification_loss: 0.0906 157/500 [========>.....................] - ETA: 2:13 - loss: 0.8149 - regression_loss: 0.7239 - classification_loss: 0.0910 158/500 [========>.....................] 
- ETA: 2:12 - loss: 0.8114 - regression_loss: 0.7209 - classification_loss: 0.0906 159/500 [========>.....................] - ETA: 2:12 - loss: 0.8107 - regression_loss: 0.7203 - classification_loss: 0.0904 160/500 [========>.....................] - ETA: 2:11 - loss: 0.8088 - regression_loss: 0.7187 - classification_loss: 0.0901 161/500 [========>.....................] - ETA: 2:11 - loss: 0.8120 - regression_loss: 0.7215 - classification_loss: 0.0905 162/500 [========>.....................] - ETA: 2:11 - loss: 0.8123 - regression_loss: 0.7216 - classification_loss: 0.0907 163/500 [========>.....................] - ETA: 2:10 - loss: 0.8101 - regression_loss: 0.7197 - classification_loss: 0.0904 164/500 [========>.....................] - ETA: 2:10 - loss: 0.8114 - regression_loss: 0.7210 - classification_loss: 0.0904 165/500 [========>.....................] - ETA: 2:10 - loss: 0.8104 - regression_loss: 0.7201 - classification_loss: 0.0903 166/500 [========>.....................] - ETA: 2:09 - loss: 0.8100 - regression_loss: 0.7199 - classification_loss: 0.0902 167/500 [=========>....................] - ETA: 2:09 - loss: 0.8093 - regression_loss: 0.7195 - classification_loss: 0.0899 168/500 [=========>....................] - ETA: 2:08 - loss: 0.8113 - regression_loss: 0.7212 - classification_loss: 0.0901 169/500 [=========>....................] - ETA: 2:08 - loss: 0.8100 - regression_loss: 0.7201 - classification_loss: 0.0899 170/500 [=========>....................] - ETA: 2:08 - loss: 0.8102 - regression_loss: 0.7202 - classification_loss: 0.0899 171/500 [=========>....................] - ETA: 2:07 - loss: 0.8133 - regression_loss: 0.7225 - classification_loss: 0.0907 172/500 [=========>....................] - ETA: 2:07 - loss: 0.8112 - regression_loss: 0.7207 - classification_loss: 0.0905 173/500 [=========>....................] - ETA: 2:07 - loss: 0.8133 - regression_loss: 0.7227 - classification_loss: 0.0905 174/500 [=========>....................] 
- ETA: 2:06 - loss: 0.8124 - regression_loss: 0.7220 - classification_loss: 0.0904 175/500 [=========>....................] - ETA: 2:06 - loss: 0.8128 - regression_loss: 0.7224 - classification_loss: 0.0904 176/500 [=========>....................] - ETA: 2:05 - loss: 0.8115 - regression_loss: 0.7212 - classification_loss: 0.0903 177/500 [=========>....................] - ETA: 2:05 - loss: 0.8122 - regression_loss: 0.7220 - classification_loss: 0.0902 178/500 [=========>....................] - ETA: 2:05 - loss: 0.8098 - regression_loss: 0.7199 - classification_loss: 0.0899 179/500 [=========>....................] - ETA: 2:04 - loss: 0.8081 - regression_loss: 0.7184 - classification_loss: 0.0897 180/500 [=========>....................] - ETA: 2:04 - loss: 0.8081 - regression_loss: 0.7183 - classification_loss: 0.0898 181/500 [=========>....................] - ETA: 2:03 - loss: 0.8080 - regression_loss: 0.7183 - classification_loss: 0.0897 182/500 [=========>....................] - ETA: 2:03 - loss: 0.8084 - regression_loss: 0.7186 - classification_loss: 0.0897 183/500 [=========>....................] - ETA: 2:03 - loss: 0.8076 - regression_loss: 0.7180 - classification_loss: 0.0896 184/500 [==========>...................] - ETA: 2:02 - loss: 0.8084 - regression_loss: 0.7187 - classification_loss: 0.0898 185/500 [==========>...................] - ETA: 2:02 - loss: 0.8100 - regression_loss: 0.7201 - classification_loss: 0.0899 186/500 [==========>...................] - ETA: 2:01 - loss: 0.8093 - regression_loss: 0.7195 - classification_loss: 0.0897 187/500 [==========>...................] - ETA: 2:01 - loss: 0.8105 - regression_loss: 0.7205 - classification_loss: 0.0900 188/500 [==========>...................] - ETA: 2:01 - loss: 0.8113 - regression_loss: 0.7212 - classification_loss: 0.0901 189/500 [==========>...................] - ETA: 2:00 - loss: 0.8100 - regression_loss: 0.7200 - classification_loss: 0.0900 190/500 [==========>...................] 
- ETA: 2:00 - loss: 0.8089 - regression_loss: 0.7190 - classification_loss: 0.0899 191/500 [==========>...................] - ETA: 2:00 - loss: 0.8070 - regression_loss: 0.7172 - classification_loss: 0.0897 192/500 [==========>...................] - ETA: 1:59 - loss: 0.8075 - regression_loss: 0.7176 - classification_loss: 0.0899 193/500 [==========>...................] - ETA: 1:59 - loss: 0.8085 - regression_loss: 0.7186 - classification_loss: 0.0899 194/500 [==========>...................] - ETA: 1:58 - loss: 0.8067 - regression_loss: 0.7171 - classification_loss: 0.0896 195/500 [==========>...................] - ETA: 1:58 - loss: 0.8103 - regression_loss: 0.7201 - classification_loss: 0.0902 196/500 [==========>...................] - ETA: 1:58 - loss: 0.8114 - regression_loss: 0.7211 - classification_loss: 0.0903 197/500 [==========>...................] - ETA: 1:57 - loss: 0.8115 - regression_loss: 0.7211 - classification_loss: 0.0903 198/500 [==========>...................] - ETA: 1:57 - loss: 0.8094 - regression_loss: 0.7192 - classification_loss: 0.0902 199/500 [==========>...................] - ETA: 1:56 - loss: 0.8091 - regression_loss: 0.7190 - classification_loss: 0.0902 200/500 [===========>..................] - ETA: 1:56 - loss: 0.8112 - regression_loss: 0.7209 - classification_loss: 0.0903 201/500 [===========>..................] - ETA: 1:56 - loss: 0.8132 - regression_loss: 0.7228 - classification_loss: 0.0904 202/500 [===========>..................] - ETA: 1:55 - loss: 0.8124 - regression_loss: 0.7217 - classification_loss: 0.0907 203/500 [===========>..................] - ETA: 1:55 - loss: 0.8109 - regression_loss: 0.7204 - classification_loss: 0.0905 204/500 [===========>..................] - ETA: 1:55 - loss: 0.8110 - regression_loss: 0.7204 - classification_loss: 0.0906 205/500 [===========>..................] - ETA: 1:54 - loss: 0.8094 - regression_loss: 0.7191 - classification_loss: 0.0903 206/500 [===========>..................] 
- ETA: 1:54 - loss: 0.8093 - regression_loss: 0.7190 - classification_loss: 0.0903 207/500 [===========>..................] - ETA: 1:53 - loss: 0.8086 - regression_loss: 0.7183 - classification_loss: 0.0902 208/500 [===========>..................] - ETA: 1:53 - loss: 0.8089 - regression_loss: 0.7187 - classification_loss: 0.0902 209/500 [===========>..................] - ETA: 1:53 - loss: 0.8109 - regression_loss: 0.7209 - classification_loss: 0.0901 210/500 [===========>..................] - ETA: 1:52 - loss: 0.8109 - regression_loss: 0.7208 - classification_loss: 0.0901 211/500 [===========>..................] - ETA: 1:52 - loss: 0.8114 - regression_loss: 0.7212 - classification_loss: 0.0902 212/500 [===========>..................] - ETA: 1:52 - loss: 0.8102 - regression_loss: 0.7201 - classification_loss: 0.0901 213/500 [===========>..................] - ETA: 1:51 - loss: 0.8096 - regression_loss: 0.7197 - classification_loss: 0.0899 214/500 [===========>..................] - ETA: 1:51 - loss: 0.8103 - regression_loss: 0.7202 - classification_loss: 0.0902 215/500 [===========>..................] - ETA: 1:50 - loss: 0.8100 - regression_loss: 0.7198 - classification_loss: 0.0901 216/500 [===========>..................] - ETA: 1:50 - loss: 0.8094 - regression_loss: 0.7194 - classification_loss: 0.0900 217/500 [============>.................] - ETA: 1:50 - loss: 0.8086 - regression_loss: 0.7188 - classification_loss: 0.0898 218/500 [============>.................] - ETA: 1:49 - loss: 0.8091 - regression_loss: 0.7192 - classification_loss: 0.0900 219/500 [============>.................] - ETA: 1:49 - loss: 0.8090 - regression_loss: 0.7188 - classification_loss: 0.0901 220/500 [============>.................] - ETA: 1:48 - loss: 0.8096 - regression_loss: 0.7195 - classification_loss: 0.0900 221/500 [============>.................] - ETA: 1:48 - loss: 0.8102 - regression_loss: 0.7201 - classification_loss: 0.0900 222/500 [============>.................] 
- ETA: 1:48 - loss: 0.8123 - regression_loss: 0.7219 - classification_loss: 0.0903 223/500 [============>.................] - ETA: 1:47 - loss: 0.8114 - regression_loss: 0.7212 - classification_loss: 0.0902 224/500 [============>.................] - ETA: 1:47 - loss: 0.8129 - regression_loss: 0.7226 - classification_loss: 0.0903 225/500 [============>.................] - ETA: 1:46 - loss: 0.8122 - regression_loss: 0.7221 - classification_loss: 0.0901 226/500 [============>.................] - ETA: 1:46 - loss: 0.8127 - regression_loss: 0.7224 - classification_loss: 0.0904 227/500 [============>.................] - ETA: 1:46 - loss: 0.8128 - regression_loss: 0.7224 - classification_loss: 0.0904 228/500 [============>.................] - ETA: 1:45 - loss: 0.8124 - regression_loss: 0.7220 - classification_loss: 0.0904 229/500 [============>.................] - ETA: 1:45 - loss: 0.8103 - regression_loss: 0.7201 - classification_loss: 0.0902 230/500 [============>.................] - ETA: 1:44 - loss: 0.8083 - regression_loss: 0.7184 - classification_loss: 0.0899 231/500 [============>.................] - ETA: 1:44 - loss: 0.8070 - regression_loss: 0.7172 - classification_loss: 0.0898 232/500 [============>.................] - ETA: 1:44 - loss: 0.8079 - regression_loss: 0.7179 - classification_loss: 0.0900 233/500 [============>.................] - ETA: 1:43 - loss: 0.8067 - regression_loss: 0.7169 - classification_loss: 0.0898 234/500 [=============>................] - ETA: 1:43 - loss: 0.8058 - regression_loss: 0.7160 - classification_loss: 0.0898 235/500 [=============>................] - ETA: 1:43 - loss: 0.8059 - regression_loss: 0.7161 - classification_loss: 0.0898 236/500 [=============>................] - ETA: 1:42 - loss: 0.8082 - regression_loss: 0.7181 - classification_loss: 0.0902 237/500 [=============>................] - ETA: 1:42 - loss: 0.8090 - regression_loss: 0.7188 - classification_loss: 0.0902 238/500 [=============>................] 
- ETA: 1:41 - loss: 0.8082 - regression_loss: 0.7182 - classification_loss: 0.0901 239/500 [=============>................] - ETA: 1:41 - loss: 0.8072 - regression_loss: 0.7173 - classification_loss: 0.0899 240/500 [=============>................] - ETA: 1:41 - loss: 0.8081 - regression_loss: 0.7182 - classification_loss: 0.0899 241/500 [=============>................] - ETA: 1:40 - loss: 0.8082 - regression_loss: 0.7183 - classification_loss: 0.0899 242/500 [=============>................] - ETA: 1:40 - loss: 0.8075 - regression_loss: 0.7178 - classification_loss: 0.0897 243/500 [=============>................] - ETA: 1:39 - loss: 0.8052 - regression_loss: 0.7158 - classification_loss: 0.0894 244/500 [=============>................] - ETA: 1:39 - loss: 0.8040 - regression_loss: 0.7147 - classification_loss: 0.0893 245/500 [=============>................] - ETA: 1:39 - loss: 0.8051 - regression_loss: 0.7158 - classification_loss: 0.0893 246/500 [=============>................] - ETA: 1:38 - loss: 0.8061 - regression_loss: 0.7168 - classification_loss: 0.0893 247/500 [=============>................] - ETA: 1:38 - loss: 0.8057 - regression_loss: 0.7164 - classification_loss: 0.0894 248/500 [=============>................] - ETA: 1:37 - loss: 0.8043 - regression_loss: 0.7152 - classification_loss: 0.0891 249/500 [=============>................] - ETA: 1:37 - loss: 0.8037 - regression_loss: 0.7148 - classification_loss: 0.0889 250/500 [==============>...............] - ETA: 1:37 - loss: 0.8045 - regression_loss: 0.7153 - classification_loss: 0.0891 251/500 [==============>...............] - ETA: 1:36 - loss: 0.8049 - regression_loss: 0.7157 - classification_loss: 0.0892 252/500 [==============>...............] - ETA: 1:36 - loss: 0.8056 - regression_loss: 0.7166 - classification_loss: 0.0890 253/500 [==============>...............] - ETA: 1:35 - loss: 0.8052 - regression_loss: 0.7163 - classification_loss: 0.0889 254/500 [==============>...............] 
- ETA: 1:35 - loss: 0.8052 - regression_loss: 0.7163 - classification_loss: 0.0889 255/500 [==============>...............] - ETA: 1:35 - loss: 0.8051 - regression_loss: 0.7162 - classification_loss: 0.0889 256/500 [==============>...............] - ETA: 1:34 - loss: 0.8044 - regression_loss: 0.7155 - classification_loss: 0.0889 257/500 [==============>...............] - ETA: 1:34 - loss: 0.8046 - regression_loss: 0.7157 - classification_loss: 0.0888 258/500 [==============>...............] - ETA: 1:34 - loss: 0.8049 - regression_loss: 0.7160 - classification_loss: 0.0889 259/500 [==============>...............] - ETA: 1:33 - loss: 0.8054 - regression_loss: 0.7164 - classification_loss: 0.0890 260/500 [==============>...............] - ETA: 1:33 - loss: 0.8044 - regression_loss: 0.7156 - classification_loss: 0.0889 261/500 [==============>...............] - ETA: 1:32 - loss: 0.8047 - regression_loss: 0.7158 - classification_loss: 0.0889 262/500 [==============>...............] - ETA: 1:32 - loss: 0.8047 - regression_loss: 0.7159 - classification_loss: 0.0888 263/500 [==============>...............] - ETA: 1:32 - loss: 0.8032 - regression_loss: 0.7147 - classification_loss: 0.0885 264/500 [==============>...............] - ETA: 1:31 - loss: 0.8043 - regression_loss: 0.7158 - classification_loss: 0.0886 265/500 [==============>...............] - ETA: 1:31 - loss: 0.8044 - regression_loss: 0.7158 - classification_loss: 0.0886 266/500 [==============>...............] - ETA: 1:30 - loss: 0.8025 - regression_loss: 0.7141 - classification_loss: 0.0884 267/500 [===============>..............] - ETA: 1:30 - loss: 0.8005 - regression_loss: 0.7124 - classification_loss: 0.0881 268/500 [===============>..............] - ETA: 1:30 - loss: 0.8000 - regression_loss: 0.7119 - classification_loss: 0.0881 269/500 [===============>..............] - ETA: 1:29 - loss: 0.7987 - regression_loss: 0.7107 - classification_loss: 0.0880 270/500 [===============>..............] 
- ETA: 1:29 - loss: 0.7994 - regression_loss: 0.7111 - classification_loss: 0.0882 271/500 [===============>..............] - ETA: 1:29 - loss: 0.7984 - regression_loss: 0.7103 - classification_loss: 0.0881 272/500 [===============>..............] - ETA: 1:28 - loss: 0.7987 - regression_loss: 0.7106 - classification_loss: 0.0881 273/500 [===============>..............] - ETA: 1:28 - loss: 0.7990 - regression_loss: 0.7110 - classification_loss: 0.0880 274/500 [===============>..............] - ETA: 1:27 - loss: 0.8004 - regression_loss: 0.7122 - classification_loss: 0.0882 275/500 [===============>..............] - ETA: 1:27 - loss: 0.8015 - regression_loss: 0.7133 - classification_loss: 0.0882 276/500 [===============>..............] - ETA: 1:27 - loss: 0.8024 - regression_loss: 0.7141 - classification_loss: 0.0883 277/500 [===============>..............] - ETA: 1:26 - loss: 0.8041 - regression_loss: 0.7157 - classification_loss: 0.0883 278/500 [===============>..............] - ETA: 1:26 - loss: 0.8030 - regression_loss: 0.7147 - classification_loss: 0.0883 279/500 [===============>..............] - ETA: 1:25 - loss: 0.8032 - regression_loss: 0.7148 - classification_loss: 0.0884 280/500 [===============>..............] - ETA: 1:25 - loss: 0.8035 - regression_loss: 0.7150 - classification_loss: 0.0885 281/500 [===============>..............] - ETA: 1:25 - loss: 0.8025 - regression_loss: 0.7140 - classification_loss: 0.0886 282/500 [===============>..............] - ETA: 1:24 - loss: 0.8033 - regression_loss: 0.7146 - classification_loss: 0.0887 283/500 [===============>..............] - ETA: 1:24 - loss: 0.8032 - regression_loss: 0.7144 - classification_loss: 0.0887 284/500 [================>.............] - ETA: 1:23 - loss: 0.8050 - regression_loss: 0.7160 - classification_loss: 0.0890 285/500 [================>.............] - ETA: 1:23 - loss: 0.8042 - regression_loss: 0.7152 - classification_loss: 0.0890 286/500 [================>.............] 
- ETA: 1:23 - loss: 0.8031 - regression_loss: 0.7143 - classification_loss: 0.0888 287/500 [================>.............] - ETA: 1:22 - loss: 0.8030 - regression_loss: 0.7141 - classification_loss: 0.0889 288/500 [================>.............] - ETA: 1:22 - loss: 0.8015 - regression_loss: 0.7128 - classification_loss: 0.0887 289/500 [================>.............] - ETA: 1:22 - loss: 0.8013 - regression_loss: 0.7126 - classification_loss: 0.0887 290/500 [================>.............] - ETA: 1:21 - loss: 0.8010 - regression_loss: 0.7124 - classification_loss: 0.0886 291/500 [================>.............] - ETA: 1:21 - loss: 0.8000 - regression_loss: 0.7115 - classification_loss: 0.0885 292/500 [================>.............] - ETA: 1:20 - loss: 0.8003 - regression_loss: 0.7118 - classification_loss: 0.0885 293/500 [================>.............] - ETA: 1:20 - loss: 0.8021 - regression_loss: 0.7134 - classification_loss: 0.0887 294/500 [================>.............] - ETA: 1:20 - loss: 0.8021 - regression_loss: 0.7135 - classification_loss: 0.0886 295/500 [================>.............] - ETA: 1:19 - loss: 0.8010 - regression_loss: 0.7126 - classification_loss: 0.0885 296/500 [================>.............] - ETA: 1:19 - loss: 0.8007 - regression_loss: 0.7123 - classification_loss: 0.0884 297/500 [================>.............] - ETA: 1:18 - loss: 0.7999 - regression_loss: 0.7117 - classification_loss: 0.0882 298/500 [================>.............] - ETA: 1:18 - loss: 0.7996 - regression_loss: 0.7114 - classification_loss: 0.0882 299/500 [================>.............] - ETA: 1:18 - loss: 0.7998 - regression_loss: 0.7115 - classification_loss: 0.0883 300/500 [=================>............] - ETA: 1:17 - loss: 0.7989 - regression_loss: 0.7106 - classification_loss: 0.0883 301/500 [=================>............] - ETA: 1:17 - loss: 0.7989 - regression_loss: 0.7107 - classification_loss: 0.0882 302/500 [=================>............] 
- ETA: 1:16 - loss: 0.7990 - regression_loss: 0.7109 - classification_loss: 0.0882 303/500 [=================>............] - ETA: 1:16 - loss: 0.7975 - regression_loss: 0.7095 - classification_loss: 0.0880 304/500 [=================>............] - ETA: 1:16 - loss: 0.7987 - regression_loss: 0.7106 - classification_loss: 0.0880 305/500 [=================>............] - ETA: 1:15 - loss: 0.7994 - regression_loss: 0.7114 - classification_loss: 0.0880 306/500 [=================>............] - ETA: 1:15 - loss: 0.7991 - regression_loss: 0.7111 - classification_loss: 0.0880 307/500 [=================>............] - ETA: 1:15 - loss: 0.7984 - regression_loss: 0.7105 - classification_loss: 0.0878 308/500 [=================>............] - ETA: 1:14 - loss: 0.7979 - regression_loss: 0.7102 - classification_loss: 0.0877 309/500 [=================>............] - ETA: 1:14 - loss: 0.7969 - regression_loss: 0.7093 - classification_loss: 0.0876 310/500 [=================>............] - ETA: 1:13 - loss: 0.7978 - regression_loss: 0.7101 - classification_loss: 0.0877 311/500 [=================>............] - ETA: 1:13 - loss: 0.7978 - regression_loss: 0.7101 - classification_loss: 0.0877 312/500 [=================>............] - ETA: 1:13 - loss: 0.7973 - regression_loss: 0.7097 - classification_loss: 0.0875 313/500 [=================>............] - ETA: 1:12 - loss: 0.7958 - regression_loss: 0.7084 - classification_loss: 0.0874 314/500 [=================>............] - ETA: 1:12 - loss: 0.7947 - regression_loss: 0.7074 - classification_loss: 0.0872 315/500 [=================>............] - ETA: 1:11 - loss: 0.7931 - regression_loss: 0.7060 - classification_loss: 0.0870 316/500 [=================>............] - ETA: 1:11 - loss: 0.7920 - regression_loss: 0.7051 - classification_loss: 0.0869 317/500 [==================>...........] - ETA: 1:11 - loss: 0.7935 - regression_loss: 0.7064 - classification_loss: 0.0871 318/500 [==================>...........] 
[per-batch progress updates elided]
500/500 [==============================] - 193s 385ms/step - loss: 0.7915 - regression_loss: 0.7035 - classification_loss: 0.0881
1172 instances of class plum with average precision: 0.7149
mAP: 0.7149
Epoch 00013: saving model to ./training/snapshots/resnet101_pascal_13.h5
Epoch 14/20
[per-batch progress updates elided]
152/500 [========>.....................] - ETA: 2:13 - loss: 0.7313 - regression_loss: 0.6546 - classification_loss: 0.0766
- ETA: 2:19 - loss: 0.7209 - regression_loss: 0.6451 - classification_loss: 0.0758 138/500 [=======>......................] - ETA: 2:19 - loss: 0.7204 - regression_loss: 0.6445 - classification_loss: 0.0759 139/500 [=======>......................] - ETA: 2:18 - loss: 0.7213 - regression_loss: 0.6454 - classification_loss: 0.0759 140/500 [=======>......................] - ETA: 2:18 - loss: 0.7227 - regression_loss: 0.6467 - classification_loss: 0.0760 141/500 [=======>......................] - ETA: 2:18 - loss: 0.7218 - regression_loss: 0.6458 - classification_loss: 0.0760 142/500 [=======>......................] - ETA: 2:17 - loss: 0.7219 - regression_loss: 0.6458 - classification_loss: 0.0761 143/500 [=======>......................] - ETA: 2:17 - loss: 0.7204 - regression_loss: 0.6444 - classification_loss: 0.0760 144/500 [=======>......................] - ETA: 2:17 - loss: 0.7214 - regression_loss: 0.6454 - classification_loss: 0.0760 145/500 [=======>......................] - ETA: 2:16 - loss: 0.7208 - regression_loss: 0.6451 - classification_loss: 0.0757 146/500 [=======>......................] - ETA: 2:16 - loss: 0.7237 - regression_loss: 0.6481 - classification_loss: 0.0756 147/500 [=======>......................] - ETA: 2:15 - loss: 0.7244 - regression_loss: 0.6489 - classification_loss: 0.0755 148/500 [=======>......................] - ETA: 2:15 - loss: 0.7277 - regression_loss: 0.6517 - classification_loss: 0.0760 149/500 [=======>......................] - ETA: 2:14 - loss: 0.7295 - regression_loss: 0.6534 - classification_loss: 0.0761 150/500 [========>.....................] - ETA: 2:14 - loss: 0.7306 - regression_loss: 0.6544 - classification_loss: 0.0762 151/500 [========>.....................] - ETA: 2:14 - loss: 0.7292 - regression_loss: 0.6531 - classification_loss: 0.0761 152/500 [========>.....................] - ETA: 2:13 - loss: 0.7313 - regression_loss: 0.6546 - classification_loss: 0.0766 153/500 [========>.....................] 
- ETA: 2:13 - loss: 0.7348 - regression_loss: 0.6575 - classification_loss: 0.0773 154/500 [========>.....................] - ETA: 2:12 - loss: 0.7356 - regression_loss: 0.6580 - classification_loss: 0.0776 155/500 [========>.....................] - ETA: 2:12 - loss: 0.7366 - regression_loss: 0.6589 - classification_loss: 0.0777 156/500 [========>.....................] - ETA: 2:12 - loss: 0.7373 - regression_loss: 0.6594 - classification_loss: 0.0779 157/500 [========>.....................] - ETA: 2:11 - loss: 0.7367 - regression_loss: 0.6589 - classification_loss: 0.0778 158/500 [========>.....................] - ETA: 2:11 - loss: 0.7342 - regression_loss: 0.6568 - classification_loss: 0.0775 159/500 [========>.....................] - ETA: 2:10 - loss: 0.7354 - regression_loss: 0.6577 - classification_loss: 0.0777 160/500 [========>.....................] - ETA: 2:10 - loss: 0.7353 - regression_loss: 0.6576 - classification_loss: 0.0776 161/500 [========>.....................] - ETA: 2:10 - loss: 0.7331 - regression_loss: 0.6558 - classification_loss: 0.0774 162/500 [========>.....................] - ETA: 2:09 - loss: 0.7314 - regression_loss: 0.6543 - classification_loss: 0.0771 163/500 [========>.....................] - ETA: 2:09 - loss: 0.7324 - regression_loss: 0.6551 - classification_loss: 0.0773 164/500 [========>.....................] - ETA: 2:08 - loss: 0.7342 - regression_loss: 0.6568 - classification_loss: 0.0774 165/500 [========>.....................] - ETA: 2:08 - loss: 0.7336 - regression_loss: 0.6563 - classification_loss: 0.0773 166/500 [========>.....................] - ETA: 2:08 - loss: 0.7331 - regression_loss: 0.6559 - classification_loss: 0.0772 167/500 [=========>....................] - ETA: 2:07 - loss: 0.7342 - regression_loss: 0.6569 - classification_loss: 0.0773 168/500 [=========>....................] - ETA: 2:07 - loss: 0.7358 - regression_loss: 0.6582 - classification_loss: 0.0776 169/500 [=========>....................] 
- ETA: 2:07 - loss: 0.7371 - regression_loss: 0.6593 - classification_loss: 0.0777 170/500 [=========>....................] - ETA: 2:06 - loss: 0.7387 - regression_loss: 0.6610 - classification_loss: 0.0777 171/500 [=========>....................] - ETA: 2:06 - loss: 0.7402 - regression_loss: 0.6622 - classification_loss: 0.0780 172/500 [=========>....................] - ETA: 2:05 - loss: 0.7415 - regression_loss: 0.6634 - classification_loss: 0.0781 173/500 [=========>....................] - ETA: 2:05 - loss: 0.7403 - regression_loss: 0.6621 - classification_loss: 0.0781 174/500 [=========>....................] - ETA: 2:04 - loss: 0.7396 - regression_loss: 0.6618 - classification_loss: 0.0778 175/500 [=========>....................] - ETA: 2:04 - loss: 0.7398 - regression_loss: 0.6619 - classification_loss: 0.0778 176/500 [=========>....................] - ETA: 2:04 - loss: 0.7403 - regression_loss: 0.6625 - classification_loss: 0.0778 177/500 [=========>....................] - ETA: 2:03 - loss: 0.7386 - regression_loss: 0.6610 - classification_loss: 0.0777 178/500 [=========>....................] - ETA: 2:03 - loss: 0.7364 - regression_loss: 0.6590 - classification_loss: 0.0774 179/500 [=========>....................] - ETA: 2:03 - loss: 0.7377 - regression_loss: 0.6604 - classification_loss: 0.0772 180/500 [=========>....................] - ETA: 2:02 - loss: 0.7379 - regression_loss: 0.6607 - classification_loss: 0.0772 181/500 [=========>....................] - ETA: 2:02 - loss: 0.7379 - regression_loss: 0.6606 - classification_loss: 0.0773 182/500 [=========>....................] - ETA: 2:01 - loss: 0.7392 - regression_loss: 0.6617 - classification_loss: 0.0775 183/500 [=========>....................] - ETA: 2:01 - loss: 0.7402 - regression_loss: 0.6624 - classification_loss: 0.0777 184/500 [==========>...................] - ETA: 2:01 - loss: 0.7399 - regression_loss: 0.6622 - classification_loss: 0.0777 185/500 [==========>...................] 
- ETA: 2:00 - loss: 0.7389 - regression_loss: 0.6613 - classification_loss: 0.0776 186/500 [==========>...................] - ETA: 2:00 - loss: 0.7407 - regression_loss: 0.6626 - classification_loss: 0.0780 187/500 [==========>...................] - ETA: 1:59 - loss: 0.7415 - regression_loss: 0.6632 - classification_loss: 0.0782 188/500 [==========>...................] - ETA: 1:59 - loss: 0.7411 - regression_loss: 0.6629 - classification_loss: 0.0782 189/500 [==========>...................] - ETA: 1:59 - loss: 0.7407 - regression_loss: 0.6625 - classification_loss: 0.0781 190/500 [==========>...................] - ETA: 1:58 - loss: 0.7420 - regression_loss: 0.6636 - classification_loss: 0.0784 191/500 [==========>...................] - ETA: 1:58 - loss: 0.7416 - regression_loss: 0.6632 - classification_loss: 0.0784 192/500 [==========>...................] - ETA: 1:57 - loss: 0.7412 - regression_loss: 0.6628 - classification_loss: 0.0784 193/500 [==========>...................] - ETA: 1:57 - loss: 0.7388 - regression_loss: 0.6607 - classification_loss: 0.0782 194/500 [==========>...................] - ETA: 1:57 - loss: 0.7383 - regression_loss: 0.6602 - classification_loss: 0.0781 195/500 [==========>...................] - ETA: 1:56 - loss: 0.7373 - regression_loss: 0.6593 - classification_loss: 0.0780 196/500 [==========>...................] - ETA: 1:56 - loss: 0.7360 - regression_loss: 0.6579 - classification_loss: 0.0782 197/500 [==========>...................] - ETA: 1:56 - loss: 0.7348 - regression_loss: 0.6567 - classification_loss: 0.0781 198/500 [==========>...................] - ETA: 1:55 - loss: 0.7357 - regression_loss: 0.6576 - classification_loss: 0.0781 199/500 [==========>...................] - ETA: 1:55 - loss: 0.7358 - regression_loss: 0.6577 - classification_loss: 0.0781 200/500 [===========>..................] - ETA: 1:54 - loss: 0.7345 - regression_loss: 0.6566 - classification_loss: 0.0779 201/500 [===========>..................] 
- ETA: 1:54 - loss: 0.7351 - regression_loss: 0.6570 - classification_loss: 0.0781 202/500 [===========>..................] - ETA: 1:54 - loss: 0.7341 - regression_loss: 0.6561 - classification_loss: 0.0780 203/500 [===========>..................] - ETA: 1:53 - loss: 0.7333 - regression_loss: 0.6554 - classification_loss: 0.0778 204/500 [===========>..................] - ETA: 1:53 - loss: 0.7321 - regression_loss: 0.6545 - classification_loss: 0.0776 205/500 [===========>..................] - ETA: 1:53 - loss: 0.7317 - regression_loss: 0.6543 - classification_loss: 0.0775 206/500 [===========>..................] - ETA: 1:52 - loss: 0.7302 - regression_loss: 0.6529 - classification_loss: 0.0772 207/500 [===========>..................] - ETA: 1:52 - loss: 0.7319 - regression_loss: 0.6544 - classification_loss: 0.0775 208/500 [===========>..................] - ETA: 1:51 - loss: 0.7354 - regression_loss: 0.6571 - classification_loss: 0.0783 209/500 [===========>..................] - ETA: 1:51 - loss: 0.7352 - regression_loss: 0.6569 - classification_loss: 0.0783 210/500 [===========>..................] - ETA: 1:51 - loss: 0.7345 - regression_loss: 0.6563 - classification_loss: 0.0783 211/500 [===========>..................] - ETA: 1:50 - loss: 0.7361 - regression_loss: 0.6576 - classification_loss: 0.0785 212/500 [===========>..................] - ETA: 1:50 - loss: 0.7340 - regression_loss: 0.6557 - classification_loss: 0.0783 213/500 [===========>..................] - ETA: 1:50 - loss: 0.7319 - regression_loss: 0.6538 - classification_loss: 0.0781 214/500 [===========>..................] - ETA: 1:49 - loss: 0.7321 - regression_loss: 0.6541 - classification_loss: 0.0780 215/500 [===========>..................] - ETA: 1:49 - loss: 0.7327 - regression_loss: 0.6545 - classification_loss: 0.0782 216/500 [===========>..................] - ETA: 1:48 - loss: 0.7345 - regression_loss: 0.6561 - classification_loss: 0.0784 217/500 [============>.................] 
- ETA: 1:48 - loss: 0.7334 - regression_loss: 0.6550 - classification_loss: 0.0784 218/500 [============>.................] - ETA: 1:48 - loss: 0.7330 - regression_loss: 0.6547 - classification_loss: 0.0782 219/500 [============>.................] - ETA: 1:47 - loss: 0.7346 - regression_loss: 0.6561 - classification_loss: 0.0786 220/500 [============>.................] - ETA: 1:47 - loss: 0.7355 - regression_loss: 0.6568 - classification_loss: 0.0788 221/500 [============>.................] - ETA: 1:47 - loss: 0.7377 - regression_loss: 0.6587 - classification_loss: 0.0790 222/500 [============>.................] - ETA: 1:46 - loss: 0.7377 - regression_loss: 0.6587 - classification_loss: 0.0789 223/500 [============>.................] - ETA: 1:46 - loss: 0.7379 - regression_loss: 0.6590 - classification_loss: 0.0790 224/500 [============>.................] - ETA: 1:45 - loss: 0.7381 - regression_loss: 0.6591 - classification_loss: 0.0790 225/500 [============>.................] - ETA: 1:45 - loss: 0.7371 - regression_loss: 0.6583 - classification_loss: 0.0788 226/500 [============>.................] - ETA: 1:45 - loss: 0.7388 - regression_loss: 0.6599 - classification_loss: 0.0788 227/500 [============>.................] - ETA: 1:44 - loss: 0.7401 - regression_loss: 0.6614 - classification_loss: 0.0787 228/500 [============>.................] - ETA: 1:44 - loss: 0.7413 - regression_loss: 0.6623 - classification_loss: 0.0790 229/500 [============>.................] - ETA: 1:44 - loss: 0.7432 - regression_loss: 0.6640 - classification_loss: 0.0792 230/500 [============>.................] - ETA: 1:43 - loss: 0.7443 - regression_loss: 0.6650 - classification_loss: 0.0793 231/500 [============>.................] - ETA: 1:43 - loss: 0.7457 - regression_loss: 0.6663 - classification_loss: 0.0794 232/500 [============>.................] - ETA: 1:42 - loss: 0.7457 - regression_loss: 0.6663 - classification_loss: 0.0793 233/500 [============>.................] 
- ETA: 1:42 - loss: 0.7461 - regression_loss: 0.6666 - classification_loss: 0.0795 234/500 [=============>................] - ETA: 1:42 - loss: 0.7458 - regression_loss: 0.6663 - classification_loss: 0.0795 235/500 [=============>................] - ETA: 1:41 - loss: 0.7455 - regression_loss: 0.6662 - classification_loss: 0.0793 236/500 [=============>................] - ETA: 1:41 - loss: 0.7456 - regression_loss: 0.6663 - classification_loss: 0.0793 237/500 [=============>................] - ETA: 1:40 - loss: 0.7467 - regression_loss: 0.6674 - classification_loss: 0.0793 238/500 [=============>................] - ETA: 1:40 - loss: 0.7466 - regression_loss: 0.6674 - classification_loss: 0.0792 239/500 [=============>................] - ETA: 1:40 - loss: 0.7482 - regression_loss: 0.6691 - classification_loss: 0.0791 240/500 [=============>................] - ETA: 1:39 - loss: 0.7480 - regression_loss: 0.6688 - classification_loss: 0.0792 241/500 [=============>................] - ETA: 1:39 - loss: 0.7457 - regression_loss: 0.6667 - classification_loss: 0.0790 242/500 [=============>................] - ETA: 1:39 - loss: 0.7463 - regression_loss: 0.6673 - classification_loss: 0.0790 243/500 [=============>................] - ETA: 1:38 - loss: 0.7459 - regression_loss: 0.6670 - classification_loss: 0.0789 244/500 [=============>................] - ETA: 1:38 - loss: 0.7462 - regression_loss: 0.6673 - classification_loss: 0.0789 245/500 [=============>................] - ETA: 1:37 - loss: 0.7438 - regression_loss: 0.6652 - classification_loss: 0.0786 246/500 [=============>................] - ETA: 1:37 - loss: 0.7428 - regression_loss: 0.6642 - classification_loss: 0.0786 247/500 [=============>................] - ETA: 1:37 - loss: 0.7431 - regression_loss: 0.6644 - classification_loss: 0.0786 248/500 [=============>................] - ETA: 1:36 - loss: 0.7421 - regression_loss: 0.6636 - classification_loss: 0.0785 249/500 [=============>................] 
- ETA: 1:36 - loss: 0.7418 - regression_loss: 0.6634 - classification_loss: 0.0784 250/500 [==============>...............] - ETA: 1:35 - loss: 0.7429 - regression_loss: 0.6645 - classification_loss: 0.0785 251/500 [==============>...............] - ETA: 1:35 - loss: 0.7445 - regression_loss: 0.6661 - classification_loss: 0.0785 252/500 [==============>...............] - ETA: 1:35 - loss: 0.7431 - regression_loss: 0.6648 - classification_loss: 0.0782 253/500 [==============>...............] - ETA: 1:34 - loss: 0.7439 - regression_loss: 0.6655 - classification_loss: 0.0784 254/500 [==============>...............] - ETA: 1:34 - loss: 0.7439 - regression_loss: 0.6654 - classification_loss: 0.0785 255/500 [==============>...............] - ETA: 1:34 - loss: 0.7431 - regression_loss: 0.6648 - classification_loss: 0.0783 256/500 [==============>...............] - ETA: 1:33 - loss: 0.7459 - regression_loss: 0.6669 - classification_loss: 0.0789 257/500 [==============>...............] - ETA: 1:33 - loss: 0.7456 - regression_loss: 0.6668 - classification_loss: 0.0788 258/500 [==============>...............] - ETA: 1:32 - loss: 0.7444 - regression_loss: 0.6657 - classification_loss: 0.0787 259/500 [==============>...............] - ETA: 1:32 - loss: 0.7441 - regression_loss: 0.6656 - classification_loss: 0.0785 260/500 [==============>...............] - ETA: 1:32 - loss: 0.7443 - regression_loss: 0.6657 - classification_loss: 0.0786 261/500 [==============>...............] - ETA: 1:31 - loss: 0.7440 - regression_loss: 0.6656 - classification_loss: 0.0784 262/500 [==============>...............] - ETA: 1:31 - loss: 0.7449 - regression_loss: 0.6664 - classification_loss: 0.0785 263/500 [==============>...............] - ETA: 1:30 - loss: 0.7457 - regression_loss: 0.6672 - classification_loss: 0.0785 264/500 [==============>...............] - ETA: 1:30 - loss: 0.7441 - regression_loss: 0.6658 - classification_loss: 0.0783 265/500 [==============>...............] 
- ETA: 1:30 - loss: 0.7450 - regression_loss: 0.6666 - classification_loss: 0.0784 266/500 [==============>...............] - ETA: 1:29 - loss: 0.7450 - regression_loss: 0.6664 - classification_loss: 0.0785 267/500 [===============>..............] - ETA: 1:29 - loss: 0.7453 - regression_loss: 0.6667 - classification_loss: 0.0786 268/500 [===============>..............] - ETA: 1:28 - loss: 0.7451 - regression_loss: 0.6665 - classification_loss: 0.0786 269/500 [===============>..............] - ETA: 1:28 - loss: 0.7455 - regression_loss: 0.6668 - classification_loss: 0.0787 270/500 [===============>..............] - ETA: 1:28 - loss: 0.7448 - regression_loss: 0.6661 - classification_loss: 0.0786 271/500 [===============>..............] - ETA: 1:27 - loss: 0.7438 - regression_loss: 0.6654 - classification_loss: 0.0784 272/500 [===============>..............] - ETA: 1:27 - loss: 0.7447 - regression_loss: 0.6663 - classification_loss: 0.0784 273/500 [===============>..............] - ETA: 1:27 - loss: 0.7455 - regression_loss: 0.6669 - classification_loss: 0.0786 274/500 [===============>..............] - ETA: 1:26 - loss: 0.7450 - regression_loss: 0.6664 - classification_loss: 0.0786 275/500 [===============>..............] - ETA: 1:26 - loss: 0.7450 - regression_loss: 0.6663 - classification_loss: 0.0787 276/500 [===============>..............] - ETA: 1:25 - loss: 0.7472 - regression_loss: 0.6682 - classification_loss: 0.0790 277/500 [===============>..............] - ETA: 1:25 - loss: 0.7477 - regression_loss: 0.6687 - classification_loss: 0.0790 278/500 [===============>..............] - ETA: 1:25 - loss: 0.7479 - regression_loss: 0.6689 - classification_loss: 0.0790 279/500 [===============>..............] - ETA: 1:24 - loss: 0.7470 - regression_loss: 0.6681 - classification_loss: 0.0789 280/500 [===============>..............] - ETA: 1:24 - loss: 0.7472 - regression_loss: 0.6684 - classification_loss: 0.0787 281/500 [===============>..............] 
- ETA: 1:23 - loss: 0.7477 - regression_loss: 0.6690 - classification_loss: 0.0788 282/500 [===============>..............] - ETA: 1:23 - loss: 0.7489 - regression_loss: 0.6699 - classification_loss: 0.0790 283/500 [===============>..............] - ETA: 1:23 - loss: 0.7495 - regression_loss: 0.6706 - classification_loss: 0.0789 284/500 [================>.............] - ETA: 1:22 - loss: 0.7492 - regression_loss: 0.6704 - classification_loss: 0.0788 285/500 [================>.............] - ETA: 1:22 - loss: 0.7488 - regression_loss: 0.6701 - classification_loss: 0.0787 286/500 [================>.............] - ETA: 1:22 - loss: 0.7480 - regression_loss: 0.6694 - classification_loss: 0.0786 287/500 [================>.............] - ETA: 1:21 - loss: 0.7487 - regression_loss: 0.6699 - classification_loss: 0.0788 288/500 [================>.............] - ETA: 1:21 - loss: 0.7494 - regression_loss: 0.6706 - classification_loss: 0.0788 289/500 [================>.............] - ETA: 1:20 - loss: 0.7488 - regression_loss: 0.6701 - classification_loss: 0.0787 290/500 [================>.............] - ETA: 1:20 - loss: 0.7492 - regression_loss: 0.6705 - classification_loss: 0.0787 291/500 [================>.............] - ETA: 1:20 - loss: 0.7491 - regression_loss: 0.6703 - classification_loss: 0.0788 292/500 [================>.............] - ETA: 1:19 - loss: 0.7481 - regression_loss: 0.6694 - classification_loss: 0.0786 293/500 [================>.............] - ETA: 1:19 - loss: 0.7490 - regression_loss: 0.6703 - classification_loss: 0.0787 294/500 [================>.............] - ETA: 1:19 - loss: 0.7487 - regression_loss: 0.6700 - classification_loss: 0.0787 295/500 [================>.............] - ETA: 1:18 - loss: 0.7483 - regression_loss: 0.6697 - classification_loss: 0.0786 296/500 [================>.............] - ETA: 1:18 - loss: 0.7485 - regression_loss: 0.6699 - classification_loss: 0.0786 297/500 [================>.............] 
- ETA: 1:17 - loss: 0.7483 - regression_loss: 0.6697 - classification_loss: 0.0786 298/500 [================>.............] - ETA: 1:17 - loss: 0.7486 - regression_loss: 0.6700 - classification_loss: 0.0786 299/500 [================>.............] - ETA: 1:17 - loss: 0.7490 - regression_loss: 0.6703 - classification_loss: 0.0786 300/500 [=================>............] - ETA: 1:16 - loss: 0.7490 - regression_loss: 0.6703 - classification_loss: 0.0787 301/500 [=================>............] - ETA: 1:16 - loss: 0.7490 - regression_loss: 0.6703 - classification_loss: 0.0787 302/500 [=================>............] - ETA: 1:15 - loss: 0.7495 - regression_loss: 0.6707 - classification_loss: 0.0789 303/500 [=================>............] - ETA: 1:15 - loss: 0.7492 - regression_loss: 0.6702 - classification_loss: 0.0790 304/500 [=================>............] - ETA: 1:15 - loss: 0.7484 - regression_loss: 0.6695 - classification_loss: 0.0790 305/500 [=================>............] - ETA: 1:14 - loss: 0.7499 - regression_loss: 0.6707 - classification_loss: 0.0792 306/500 [=================>............] - ETA: 1:14 - loss: 0.7496 - regression_loss: 0.6705 - classification_loss: 0.0791 307/500 [=================>............] - ETA: 1:14 - loss: 0.7494 - regression_loss: 0.6702 - classification_loss: 0.0792 308/500 [=================>............] - ETA: 1:13 - loss: 0.7498 - regression_loss: 0.6706 - classification_loss: 0.0792 309/500 [=================>............] - ETA: 1:13 - loss: 0.7487 - regression_loss: 0.6697 - classification_loss: 0.0790 310/500 [=================>............] - ETA: 1:12 - loss: 0.7480 - regression_loss: 0.6691 - classification_loss: 0.0789 311/500 [=================>............] - ETA: 1:12 - loss: 0.7479 - regression_loss: 0.6689 - classification_loss: 0.0790 312/500 [=================>............] - ETA: 1:12 - loss: 0.7466 - regression_loss: 0.6677 - classification_loss: 0.0788 313/500 [=================>............] 
- ETA: 1:11 - loss: 0.7471 - regression_loss: 0.6682 - classification_loss: 0.0788 314/500 [=================>............] - ETA: 1:11 - loss: 0.7481 - regression_loss: 0.6692 - classification_loss: 0.0789 315/500 [=================>............] - ETA: 1:11 - loss: 0.7491 - regression_loss: 0.6702 - classification_loss: 0.0789 316/500 [=================>............] - ETA: 1:10 - loss: 0.7480 - regression_loss: 0.6693 - classification_loss: 0.0788 317/500 [==================>...........] - ETA: 1:10 - loss: 0.7474 - regression_loss: 0.6686 - classification_loss: 0.0788 318/500 [==================>...........] - ETA: 1:09 - loss: 0.7474 - regression_loss: 0.6686 - classification_loss: 0.0787 319/500 [==================>...........] - ETA: 1:09 - loss: 0.7479 - regression_loss: 0.6690 - classification_loss: 0.0789 320/500 [==================>...........] - ETA: 1:09 - loss: 0.7475 - regression_loss: 0.6687 - classification_loss: 0.0788 321/500 [==================>...........] - ETA: 1:08 - loss: 0.7479 - regression_loss: 0.6690 - classification_loss: 0.0789 322/500 [==================>...........] - ETA: 1:08 - loss: 0.7487 - regression_loss: 0.6696 - classification_loss: 0.0791 323/500 [==================>...........] - ETA: 1:07 - loss: 0.7480 - regression_loss: 0.6691 - classification_loss: 0.0790 324/500 [==================>...........] - ETA: 1:07 - loss: 0.7483 - regression_loss: 0.6694 - classification_loss: 0.0790 325/500 [==================>...........] - ETA: 1:07 - loss: 0.7486 - regression_loss: 0.6696 - classification_loss: 0.0790 326/500 [==================>...........] - ETA: 1:06 - loss: 0.7481 - regression_loss: 0.6691 - classification_loss: 0.0790 327/500 [==================>...........] - ETA: 1:06 - loss: 0.7488 - regression_loss: 0.6696 - classification_loss: 0.0792 328/500 [==================>...........] - ETA: 1:05 - loss: 0.7487 - regression_loss: 0.6694 - classification_loss: 0.0793 329/500 [==================>...........] 
- ETA: 1:05 - loss: 0.7485 - regression_loss: 0.6692 - classification_loss: 0.0793 330/500 [==================>...........] - ETA: 1:05 - loss: 0.7487 - regression_loss: 0.6694 - classification_loss: 0.0794 331/500 [==================>...........] - ETA: 1:04 - loss: 0.7490 - regression_loss: 0.6696 - classification_loss: 0.0794 332/500 [==================>...........] - ETA: 1:04 - loss: 0.7486 - regression_loss: 0.6693 - classification_loss: 0.0794 333/500 [==================>...........] - ETA: 1:04 - loss: 0.7486 - regression_loss: 0.6692 - classification_loss: 0.0794 334/500 [===================>..........] - ETA: 1:03 - loss: 0.7473 - regression_loss: 0.6680 - classification_loss: 0.0793 335/500 [===================>..........] - ETA: 1:03 - loss: 0.7472 - regression_loss: 0.6680 - classification_loss: 0.0793 336/500 [===================>..........] - ETA: 1:02 - loss: 0.7486 - regression_loss: 0.6691 - classification_loss: 0.0794 337/500 [===================>..........] - ETA: 1:02 - loss: 0.7474 - regression_loss: 0.6681 - classification_loss: 0.0793 338/500 [===================>..........] - ETA: 1:02 - loss: 0.7463 - regression_loss: 0.6671 - classification_loss: 0.0792 339/500 [===================>..........] - ETA: 1:01 - loss: 0.7456 - regression_loss: 0.6665 - classification_loss: 0.0790 340/500 [===================>..........] - ETA: 1:01 - loss: 0.7455 - regression_loss: 0.6664 - classification_loss: 0.0790 341/500 [===================>..........] - ETA: 1:01 - loss: 0.7450 - regression_loss: 0.6661 - classification_loss: 0.0789 342/500 [===================>..........] - ETA: 1:00 - loss: 0.7447 - regression_loss: 0.6659 - classification_loss: 0.0788 343/500 [===================>..........] - ETA: 1:00 - loss: 0.7440 - regression_loss: 0.6652 - classification_loss: 0.0788 344/500 [===================>..........] - ETA: 59s - loss: 0.7434 - regression_loss: 0.6647 - classification_loss: 0.0787  345/500 [===================>..........] 
- ETA: 59s - loss: 0.7426 - regression_loss: 0.6640 - classification_loss: 0.0786 346/500 [===================>..........] - ETA: 59s - loss: 0.7433 - regression_loss: 0.6645 - classification_loss: 0.0787 347/500 [===================>..........] - ETA: 58s - loss: 0.7432 - regression_loss: 0.6644 - classification_loss: 0.0788 348/500 [===================>..........] - ETA: 58s - loss: 0.7441 - regression_loss: 0.6654 - classification_loss: 0.0787 349/500 [===================>..........] - ETA: 57s - loss: 0.7456 - regression_loss: 0.6666 - classification_loss: 0.0790 350/500 [====================>.........] - ETA: 57s - loss: 0.7447 - regression_loss: 0.6658 - classification_loss: 0.0789 351/500 [====================>.........] - ETA: 57s - loss: 0.7452 - regression_loss: 0.6662 - classification_loss: 0.0790 352/500 [====================>.........] - ETA: 56s - loss: 0.7450 - regression_loss: 0.6661 - classification_loss: 0.0790 353/500 [====================>.........] - ETA: 56s - loss: 0.7456 - regression_loss: 0.6666 - classification_loss: 0.0791 354/500 [====================>.........] - ETA: 55s - loss: 0.7449 - regression_loss: 0.6659 - classification_loss: 0.0789 355/500 [====================>.........] - ETA: 55s - loss: 0.7450 - regression_loss: 0.6660 - classification_loss: 0.0790 356/500 [====================>.........] - ETA: 55s - loss: 0.7455 - regression_loss: 0.6664 - classification_loss: 0.0791 357/500 [====================>.........] - ETA: 54s - loss: 0.7445 - regression_loss: 0.6655 - classification_loss: 0.0790 358/500 [====================>.........] - ETA: 54s - loss: 0.7439 - regression_loss: 0.6649 - classification_loss: 0.0789 359/500 [====================>.........] - ETA: 54s - loss: 0.7451 - regression_loss: 0.6660 - classification_loss: 0.0791 360/500 [====================>.........] - ETA: 53s - loss: 0.7457 - regression_loss: 0.6665 - classification_loss: 0.0792 361/500 [====================>.........] 
(per-batch Keras progress lines for epoch 14, steps 362-499, elided; loss hovered around 0.74)
500/500 [==============================] - 192s 383ms/step - loss: 0.7467 - regression_loss: 0.6682 - classification_loss: 0.0785
1172 instances of class plum with average precision: 0.7020
mAP: 0.7020
Epoch 00014: saving model to ./training/snapshots/resnet101_pascal_14.h5
Epoch 15/20
(per-batch Keras progress lines for epoch 15, steps 1-196, elided; running loss around 0.68)
- ETA: 1:56 - loss: 0.6811 - regression_loss: 0.6115 - classification_loss: 0.0696 197/500 [==========>...................] - ETA: 1:55 - loss: 0.6818 - regression_loss: 0.6121 - classification_loss: 0.0697 198/500 [==========>...................] - ETA: 1:55 - loss: 0.6807 - regression_loss: 0.6110 - classification_loss: 0.0696 199/500 [==========>...................] - ETA: 1:55 - loss: 0.6820 - regression_loss: 0.6121 - classification_loss: 0.0699 200/500 [===========>..................] - ETA: 1:54 - loss: 0.6828 - regression_loss: 0.6127 - classification_loss: 0.0702 201/500 [===========>..................] - ETA: 1:54 - loss: 0.6833 - regression_loss: 0.6130 - classification_loss: 0.0703 202/500 [===========>..................] - ETA: 1:53 - loss: 0.6830 - regression_loss: 0.6126 - classification_loss: 0.0704 203/500 [===========>..................] - ETA: 1:53 - loss: 0.6846 - regression_loss: 0.6139 - classification_loss: 0.0707 204/500 [===========>..................] - ETA: 1:53 - loss: 0.6844 - regression_loss: 0.6138 - classification_loss: 0.0706 205/500 [===========>..................] - ETA: 1:52 - loss: 0.6849 - regression_loss: 0.6143 - classification_loss: 0.0706 206/500 [===========>..................] - ETA: 1:52 - loss: 0.6838 - regression_loss: 0.6133 - classification_loss: 0.0705 207/500 [===========>..................] - ETA: 1:51 - loss: 0.6859 - regression_loss: 0.6152 - classification_loss: 0.0707 208/500 [===========>..................] - ETA: 1:51 - loss: 0.6844 - regression_loss: 0.6139 - classification_loss: 0.0705 209/500 [===========>..................] - ETA: 1:51 - loss: 0.6851 - regression_loss: 0.6146 - classification_loss: 0.0705 210/500 [===========>..................] - ETA: 1:50 - loss: 0.6845 - regression_loss: 0.6141 - classification_loss: 0.0704 211/500 [===========>..................] - ETA: 1:50 - loss: 0.6827 - regression_loss: 0.6125 - classification_loss: 0.0702 212/500 [===========>..................] 
- ETA: 1:50 - loss: 0.6829 - regression_loss: 0.6126 - classification_loss: 0.0703 213/500 [===========>..................] - ETA: 1:49 - loss: 0.6825 - regression_loss: 0.6123 - classification_loss: 0.0702 214/500 [===========>..................] - ETA: 1:49 - loss: 0.6814 - regression_loss: 0.6114 - classification_loss: 0.0700 215/500 [===========>..................] - ETA: 1:48 - loss: 0.6822 - regression_loss: 0.6120 - classification_loss: 0.0701 216/500 [===========>..................] - ETA: 1:48 - loss: 0.6807 - regression_loss: 0.6107 - classification_loss: 0.0700 217/500 [============>.................] - ETA: 1:48 - loss: 0.6817 - regression_loss: 0.6116 - classification_loss: 0.0701 218/500 [============>.................] - ETA: 1:47 - loss: 0.6810 - regression_loss: 0.6110 - classification_loss: 0.0700 219/500 [============>.................] - ETA: 1:47 - loss: 0.6799 - regression_loss: 0.6101 - classification_loss: 0.0698 220/500 [============>.................] - ETA: 1:46 - loss: 0.6786 - regression_loss: 0.6090 - classification_loss: 0.0696 221/500 [============>.................] - ETA: 1:46 - loss: 0.6792 - regression_loss: 0.6095 - classification_loss: 0.0697 222/500 [============>.................] - ETA: 1:46 - loss: 0.6812 - regression_loss: 0.6111 - classification_loss: 0.0701 223/500 [============>.................] - ETA: 1:45 - loss: 0.6809 - regression_loss: 0.6108 - classification_loss: 0.0701 224/500 [============>.................] - ETA: 1:45 - loss: 0.6797 - regression_loss: 0.6097 - classification_loss: 0.0700 225/500 [============>.................] - ETA: 1:45 - loss: 0.6795 - regression_loss: 0.6095 - classification_loss: 0.0700 226/500 [============>.................] - ETA: 1:44 - loss: 0.6787 - regression_loss: 0.6088 - classification_loss: 0.0699 227/500 [============>.................] - ETA: 1:44 - loss: 0.6790 - regression_loss: 0.6091 - classification_loss: 0.0699 228/500 [============>.................] 
- ETA: 1:43 - loss: 0.6797 - regression_loss: 0.6098 - classification_loss: 0.0699 229/500 [============>.................] - ETA: 1:43 - loss: 0.6795 - regression_loss: 0.6097 - classification_loss: 0.0698 230/500 [============>.................] - ETA: 1:43 - loss: 0.6796 - regression_loss: 0.6098 - classification_loss: 0.0698 231/500 [============>.................] - ETA: 1:42 - loss: 0.6798 - regression_loss: 0.6100 - classification_loss: 0.0698 232/500 [============>.................] - ETA: 1:42 - loss: 0.6788 - regression_loss: 0.6092 - classification_loss: 0.0696 233/500 [============>.................] - ETA: 1:41 - loss: 0.6794 - regression_loss: 0.6097 - classification_loss: 0.0697 234/500 [=============>................] - ETA: 1:41 - loss: 0.6799 - regression_loss: 0.6100 - classification_loss: 0.0699 235/500 [=============>................] - ETA: 1:41 - loss: 0.6794 - regression_loss: 0.6096 - classification_loss: 0.0698 236/500 [=============>................] - ETA: 1:40 - loss: 0.6793 - regression_loss: 0.6097 - classification_loss: 0.0696 237/500 [=============>................] - ETA: 1:40 - loss: 0.6799 - regression_loss: 0.6102 - classification_loss: 0.0697 238/500 [=============>................] - ETA: 1:39 - loss: 0.6824 - regression_loss: 0.6124 - classification_loss: 0.0700 239/500 [=============>................] - ETA: 1:39 - loss: 0.6812 - regression_loss: 0.6114 - classification_loss: 0.0698 240/500 [=============>................] - ETA: 1:39 - loss: 0.6817 - regression_loss: 0.6119 - classification_loss: 0.0698 241/500 [=============>................] - ETA: 1:38 - loss: 0.6804 - regression_loss: 0.6108 - classification_loss: 0.0697 242/500 [=============>................] - ETA: 1:38 - loss: 0.6820 - regression_loss: 0.6121 - classification_loss: 0.0698 243/500 [=============>................] - ETA: 1:38 - loss: 0.6828 - regression_loss: 0.6128 - classification_loss: 0.0699 244/500 [=============>................] 
- ETA: 1:37 - loss: 0.6833 - regression_loss: 0.6133 - classification_loss: 0.0700 245/500 [=============>................] - ETA: 1:37 - loss: 0.6823 - regression_loss: 0.6122 - classification_loss: 0.0700 246/500 [=============>................] - ETA: 1:36 - loss: 0.6813 - regression_loss: 0.6113 - classification_loss: 0.0700 247/500 [=============>................] - ETA: 1:36 - loss: 0.6809 - regression_loss: 0.6109 - classification_loss: 0.0700 248/500 [=============>................] - ETA: 1:36 - loss: 0.6821 - regression_loss: 0.6119 - classification_loss: 0.0702 249/500 [=============>................] - ETA: 1:35 - loss: 0.6809 - regression_loss: 0.6109 - classification_loss: 0.0700 250/500 [==============>...............] - ETA: 1:35 - loss: 0.6808 - regression_loss: 0.6108 - classification_loss: 0.0700 251/500 [==============>...............] - ETA: 1:35 - loss: 0.6800 - regression_loss: 0.6101 - classification_loss: 0.0699 252/500 [==============>...............] - ETA: 1:34 - loss: 0.6811 - regression_loss: 0.6109 - classification_loss: 0.0702 253/500 [==============>...............] - ETA: 1:34 - loss: 0.6812 - regression_loss: 0.6111 - classification_loss: 0.0701 254/500 [==============>...............] - ETA: 1:33 - loss: 0.6822 - regression_loss: 0.6119 - classification_loss: 0.0703 255/500 [==============>...............] - ETA: 1:33 - loss: 0.6822 - regression_loss: 0.6118 - classification_loss: 0.0704 256/500 [==============>...............] - ETA: 1:33 - loss: 0.6812 - regression_loss: 0.6110 - classification_loss: 0.0702 257/500 [==============>...............] - ETA: 1:32 - loss: 0.6809 - regression_loss: 0.6107 - classification_loss: 0.0702 258/500 [==============>...............] - ETA: 1:32 - loss: 0.6792 - regression_loss: 0.6093 - classification_loss: 0.0700 259/500 [==============>...............] - ETA: 1:31 - loss: 0.6794 - regression_loss: 0.6094 - classification_loss: 0.0700 260/500 [==============>...............] 
- ETA: 1:31 - loss: 0.6787 - regression_loss: 0.6086 - classification_loss: 0.0702 261/500 [==============>...............] - ETA: 1:31 - loss: 0.6795 - regression_loss: 0.6092 - classification_loss: 0.0703 262/500 [==============>...............] - ETA: 1:30 - loss: 0.6785 - regression_loss: 0.6084 - classification_loss: 0.0701 263/500 [==============>...............] - ETA: 1:30 - loss: 0.6798 - regression_loss: 0.6092 - classification_loss: 0.0706 264/500 [==============>...............] - ETA: 1:30 - loss: 0.6793 - regression_loss: 0.6087 - classification_loss: 0.0706 265/500 [==============>...............] - ETA: 1:29 - loss: 0.6789 - regression_loss: 0.6084 - classification_loss: 0.0705 266/500 [==============>...............] - ETA: 1:29 - loss: 0.6791 - regression_loss: 0.6085 - classification_loss: 0.0706 267/500 [===============>..............] - ETA: 1:28 - loss: 0.6800 - regression_loss: 0.6094 - classification_loss: 0.0706 268/500 [===============>..............] - ETA: 1:28 - loss: 0.6808 - regression_loss: 0.6103 - classification_loss: 0.0705 269/500 [===============>..............] - ETA: 1:28 - loss: 0.6798 - regression_loss: 0.6094 - classification_loss: 0.0704 270/500 [===============>..............] - ETA: 1:27 - loss: 0.6801 - regression_loss: 0.6097 - classification_loss: 0.0704 271/500 [===============>..............] - ETA: 1:27 - loss: 0.6788 - regression_loss: 0.6086 - classification_loss: 0.0702 272/500 [===============>..............] - ETA: 1:26 - loss: 0.6779 - regression_loss: 0.6078 - classification_loss: 0.0701 273/500 [===============>..............] - ETA: 1:26 - loss: 0.6790 - regression_loss: 0.6090 - classification_loss: 0.0700 274/500 [===============>..............] - ETA: 1:26 - loss: 0.6784 - regression_loss: 0.6085 - classification_loss: 0.0699 275/500 [===============>..............] - ETA: 1:25 - loss: 0.6794 - regression_loss: 0.6094 - classification_loss: 0.0700 276/500 [===============>..............] 
- ETA: 1:25 - loss: 0.6800 - regression_loss: 0.6099 - classification_loss: 0.0701 277/500 [===============>..............] - ETA: 1:24 - loss: 0.6796 - regression_loss: 0.6095 - classification_loss: 0.0701 278/500 [===============>..............] - ETA: 1:24 - loss: 0.6813 - regression_loss: 0.6112 - classification_loss: 0.0702 279/500 [===============>..............] - ETA: 1:24 - loss: 0.6802 - regression_loss: 0.6102 - classification_loss: 0.0701 280/500 [===============>..............] - ETA: 1:23 - loss: 0.6807 - regression_loss: 0.6106 - classification_loss: 0.0701 281/500 [===============>..............] - ETA: 1:23 - loss: 0.6811 - regression_loss: 0.6109 - classification_loss: 0.0702 282/500 [===============>..............] - ETA: 1:23 - loss: 0.6806 - regression_loss: 0.6105 - classification_loss: 0.0701 283/500 [===============>..............] - ETA: 1:22 - loss: 0.6796 - regression_loss: 0.6096 - classification_loss: 0.0700 284/500 [================>.............] - ETA: 1:22 - loss: 0.6803 - regression_loss: 0.6103 - classification_loss: 0.0701 285/500 [================>.............] - ETA: 1:21 - loss: 0.6809 - regression_loss: 0.6109 - classification_loss: 0.0701 286/500 [================>.............] - ETA: 1:21 - loss: 0.6811 - regression_loss: 0.6111 - classification_loss: 0.0700 287/500 [================>.............] - ETA: 1:21 - loss: 0.6811 - regression_loss: 0.6110 - classification_loss: 0.0701 288/500 [================>.............] - ETA: 1:20 - loss: 0.6811 - regression_loss: 0.6109 - classification_loss: 0.0702 289/500 [================>.............] - ETA: 1:20 - loss: 0.6811 - regression_loss: 0.6110 - classification_loss: 0.0701 290/500 [================>.............] - ETA: 1:20 - loss: 0.6816 - regression_loss: 0.6113 - classification_loss: 0.0703 291/500 [================>.............] - ETA: 1:19 - loss: 0.6813 - regression_loss: 0.6110 - classification_loss: 0.0704 292/500 [================>.............] 
- ETA: 1:19 - loss: 0.6810 - regression_loss: 0.6107 - classification_loss: 0.0703 293/500 [================>.............] - ETA: 1:18 - loss: 0.6797 - regression_loss: 0.6095 - classification_loss: 0.0702 294/500 [================>.............] - ETA: 1:18 - loss: 0.6801 - regression_loss: 0.6100 - classification_loss: 0.0702 295/500 [================>.............] - ETA: 1:18 - loss: 0.6806 - regression_loss: 0.6105 - classification_loss: 0.0702 296/500 [================>.............] - ETA: 1:17 - loss: 0.6798 - regression_loss: 0.6097 - classification_loss: 0.0700 297/500 [================>.............] - ETA: 1:17 - loss: 0.6796 - regression_loss: 0.6096 - classification_loss: 0.0700 298/500 [================>.............] - ETA: 1:16 - loss: 0.6792 - regression_loss: 0.6092 - classification_loss: 0.0700 299/500 [================>.............] - ETA: 1:16 - loss: 0.6785 - regression_loss: 0.6086 - classification_loss: 0.0699 300/500 [=================>............] - ETA: 1:16 - loss: 0.6776 - regression_loss: 0.6079 - classification_loss: 0.0698 301/500 [=================>............] - ETA: 1:15 - loss: 0.6776 - regression_loss: 0.6079 - classification_loss: 0.0697 302/500 [=================>............] - ETA: 1:15 - loss: 0.6765 - regression_loss: 0.6068 - classification_loss: 0.0696 303/500 [=================>............] - ETA: 1:15 - loss: 0.6765 - regression_loss: 0.6069 - classification_loss: 0.0696 304/500 [=================>............] - ETA: 1:14 - loss: 0.6770 - regression_loss: 0.6074 - classification_loss: 0.0697 305/500 [=================>............] - ETA: 1:14 - loss: 0.6774 - regression_loss: 0.6078 - classification_loss: 0.0696 306/500 [=================>............] - ETA: 1:13 - loss: 0.6776 - regression_loss: 0.6080 - classification_loss: 0.0696 307/500 [=================>............] - ETA: 1:13 - loss: 0.6770 - regression_loss: 0.6075 - classification_loss: 0.0695 308/500 [=================>............] 
- ETA: 1:13 - loss: 0.6771 - regression_loss: 0.6076 - classification_loss: 0.0695 309/500 [=================>............] - ETA: 1:12 - loss: 0.6775 - regression_loss: 0.6079 - classification_loss: 0.0696 310/500 [=================>............] - ETA: 1:12 - loss: 0.6777 - regression_loss: 0.6081 - classification_loss: 0.0696 311/500 [=================>............] - ETA: 1:11 - loss: 0.6783 - regression_loss: 0.6086 - classification_loss: 0.0697 312/500 [=================>............] - ETA: 1:11 - loss: 0.6784 - regression_loss: 0.6086 - classification_loss: 0.0697 313/500 [=================>............] - ETA: 1:11 - loss: 0.6798 - regression_loss: 0.6100 - classification_loss: 0.0698 314/500 [=================>............] - ETA: 1:10 - loss: 0.6811 - regression_loss: 0.6113 - classification_loss: 0.0698 315/500 [=================>............] - ETA: 1:10 - loss: 0.6800 - regression_loss: 0.6104 - classification_loss: 0.0697 316/500 [=================>............] - ETA: 1:10 - loss: 0.6810 - regression_loss: 0.6112 - classification_loss: 0.0698 317/500 [==================>...........] - ETA: 1:09 - loss: 0.6803 - regression_loss: 0.6107 - classification_loss: 0.0697 318/500 [==================>...........] - ETA: 1:09 - loss: 0.6792 - regression_loss: 0.6097 - classification_loss: 0.0695 319/500 [==================>...........] - ETA: 1:08 - loss: 0.6784 - regression_loss: 0.6090 - classification_loss: 0.0695 320/500 [==================>...........] - ETA: 1:08 - loss: 0.6789 - regression_loss: 0.6095 - classification_loss: 0.0694 321/500 [==================>...........] - ETA: 1:08 - loss: 0.6780 - regression_loss: 0.6086 - classification_loss: 0.0693 322/500 [==================>...........] - ETA: 1:07 - loss: 0.6787 - regression_loss: 0.6092 - classification_loss: 0.0695 323/500 [==================>...........] - ETA: 1:07 - loss: 0.6791 - regression_loss: 0.6096 - classification_loss: 0.0695 324/500 [==================>...........] 
- ETA: 1:07 - loss: 0.6790 - regression_loss: 0.6095 - classification_loss: 0.0695 325/500 [==================>...........] - ETA: 1:06 - loss: 0.6784 - regression_loss: 0.6089 - classification_loss: 0.0694 326/500 [==================>...........] - ETA: 1:06 - loss: 0.6781 - regression_loss: 0.6087 - classification_loss: 0.0694 327/500 [==================>...........] - ETA: 1:05 - loss: 0.6777 - regression_loss: 0.6084 - classification_loss: 0.0694 328/500 [==================>...........] - ETA: 1:05 - loss: 0.6768 - regression_loss: 0.6074 - classification_loss: 0.0693 329/500 [==================>...........] - ETA: 1:05 - loss: 0.6774 - regression_loss: 0.6080 - classification_loss: 0.0694 330/500 [==================>...........] - ETA: 1:04 - loss: 0.6780 - regression_loss: 0.6085 - classification_loss: 0.0695 331/500 [==================>...........] - ETA: 1:04 - loss: 0.6779 - regression_loss: 0.6084 - classification_loss: 0.0695 332/500 [==================>...........] - ETA: 1:04 - loss: 0.6776 - regression_loss: 0.6081 - classification_loss: 0.0695 333/500 [==================>...........] - ETA: 1:03 - loss: 0.6792 - regression_loss: 0.6096 - classification_loss: 0.0696 334/500 [===================>..........] - ETA: 1:03 - loss: 0.6789 - regression_loss: 0.6095 - classification_loss: 0.0694 335/500 [===================>..........] - ETA: 1:02 - loss: 0.6782 - regression_loss: 0.6089 - classification_loss: 0.0693 336/500 [===================>..........] - ETA: 1:02 - loss: 0.6785 - regression_loss: 0.6092 - classification_loss: 0.0693 337/500 [===================>..........] - ETA: 1:02 - loss: 0.6779 - regression_loss: 0.6087 - classification_loss: 0.0692 338/500 [===================>..........] - ETA: 1:01 - loss: 0.6784 - regression_loss: 0.6090 - classification_loss: 0.0693 339/500 [===================>..........] - ETA: 1:01 - loss: 0.6778 - regression_loss: 0.6086 - classification_loss: 0.0692 340/500 [===================>..........] 
- ETA: 1:00 - loss: 0.6786 - regression_loss: 0.6093 - classification_loss: 0.0693 341/500 [===================>..........] - ETA: 1:00 - loss: 0.6783 - regression_loss: 0.6091 - classification_loss: 0.0692 342/500 [===================>..........] - ETA: 1:00 - loss: 0.6786 - regression_loss: 0.6093 - classification_loss: 0.0693 343/500 [===================>..........] - ETA: 59s - loss: 0.6785 - regression_loss: 0.6092 - classification_loss: 0.0694  344/500 [===================>..........] - ETA: 59s - loss: 0.6778 - regression_loss: 0.6085 - classification_loss: 0.0693 345/500 [===================>..........] - ETA: 59s - loss: 0.6769 - regression_loss: 0.6077 - classification_loss: 0.0691 346/500 [===================>..........] - ETA: 58s - loss: 0.6773 - regression_loss: 0.6081 - classification_loss: 0.0692 347/500 [===================>..........] - ETA: 58s - loss: 0.6772 - regression_loss: 0.6080 - classification_loss: 0.0692 348/500 [===================>..........] - ETA: 57s - loss: 0.6762 - regression_loss: 0.6071 - classification_loss: 0.0691 349/500 [===================>..........] - ETA: 57s - loss: 0.6766 - regression_loss: 0.6075 - classification_loss: 0.0691 350/500 [====================>.........] - ETA: 57s - loss: 0.6771 - regression_loss: 0.6080 - classification_loss: 0.0692 351/500 [====================>.........] - ETA: 56s - loss: 0.6771 - regression_loss: 0.6078 - classification_loss: 0.0693 352/500 [====================>.........] - ETA: 56s - loss: 0.6772 - regression_loss: 0.6080 - classification_loss: 0.0693 353/500 [====================>.........] - ETA: 56s - loss: 0.6784 - regression_loss: 0.6089 - classification_loss: 0.0695 354/500 [====================>.........] - ETA: 55s - loss: 0.6787 - regression_loss: 0.6091 - classification_loss: 0.0696 355/500 [====================>.........] - ETA: 55s - loss: 0.6780 - regression_loss: 0.6084 - classification_loss: 0.0696 356/500 [====================>.........] 
- ETA: 54s - loss: 0.6782 - regression_loss: 0.6086 - classification_loss: 0.0696 357/500 [====================>.........] - ETA: 54s - loss: 0.6776 - regression_loss: 0.6082 - classification_loss: 0.0694 358/500 [====================>.........] - ETA: 54s - loss: 0.6776 - regression_loss: 0.6082 - classification_loss: 0.0694 359/500 [====================>.........] - ETA: 53s - loss: 0.6776 - regression_loss: 0.6082 - classification_loss: 0.0694 360/500 [====================>.........] - ETA: 53s - loss: 0.6765 - regression_loss: 0.6071 - classification_loss: 0.0693 361/500 [====================>.........] - ETA: 52s - loss: 0.6761 - regression_loss: 0.6068 - classification_loss: 0.0693 362/500 [====================>.........] - ETA: 52s - loss: 0.6757 - regression_loss: 0.6064 - classification_loss: 0.0692 363/500 [====================>.........] - ETA: 52s - loss: 0.6755 - regression_loss: 0.6063 - classification_loss: 0.0691 364/500 [====================>.........] - ETA: 51s - loss: 0.6749 - regression_loss: 0.6059 - classification_loss: 0.0690 365/500 [====================>.........] - ETA: 51s - loss: 0.6736 - regression_loss: 0.6047 - classification_loss: 0.0689 366/500 [====================>.........] - ETA: 51s - loss: 0.6741 - regression_loss: 0.6051 - classification_loss: 0.0690 367/500 [=====================>........] - ETA: 50s - loss: 0.6731 - regression_loss: 0.6042 - classification_loss: 0.0689 368/500 [=====================>........] - ETA: 50s - loss: 0.6744 - regression_loss: 0.6053 - classification_loss: 0.0691 369/500 [=====================>........] - ETA: 49s - loss: 0.6741 - regression_loss: 0.6050 - classification_loss: 0.0691 370/500 [=====================>........] - ETA: 49s - loss: 0.6737 - regression_loss: 0.6046 - classification_loss: 0.0691 371/500 [=====================>........] - ETA: 49s - loss: 0.6745 - regression_loss: 0.6053 - classification_loss: 0.0692 372/500 [=====================>........] 
- ETA: 48s - loss: 0.6748 - regression_loss: 0.6055 - classification_loss: 0.0693 373/500 [=====================>........] - ETA: 48s - loss: 0.6745 - regression_loss: 0.6052 - classification_loss: 0.0693 374/500 [=====================>........] - ETA: 48s - loss: 0.6746 - regression_loss: 0.6052 - classification_loss: 0.0693 375/500 [=====================>........] - ETA: 47s - loss: 0.6735 - regression_loss: 0.6043 - classification_loss: 0.0692 376/500 [=====================>........] - ETA: 47s - loss: 0.6731 - regression_loss: 0.6039 - classification_loss: 0.0692 377/500 [=====================>........] - ETA: 46s - loss: 0.6739 - regression_loss: 0.6046 - classification_loss: 0.0694 378/500 [=====================>........] - ETA: 46s - loss: 0.6728 - regression_loss: 0.6036 - classification_loss: 0.0692 379/500 [=====================>........] - ETA: 46s - loss: 0.6731 - regression_loss: 0.6039 - classification_loss: 0.0692 380/500 [=====================>........] - ETA: 45s - loss: 0.6734 - regression_loss: 0.6042 - classification_loss: 0.0692 381/500 [=====================>........] - ETA: 45s - loss: 0.6731 - regression_loss: 0.6039 - classification_loss: 0.0692 382/500 [=====================>........] - ETA: 45s - loss: 0.6728 - regression_loss: 0.6036 - classification_loss: 0.0691 383/500 [=====================>........] - ETA: 44s - loss: 0.6731 - regression_loss: 0.6039 - classification_loss: 0.0692 384/500 [======================>.......] - ETA: 44s - loss: 0.6734 - regression_loss: 0.6042 - classification_loss: 0.0692 385/500 [======================>.......] - ETA: 43s - loss: 0.6729 - regression_loss: 0.6037 - classification_loss: 0.0691 386/500 [======================>.......] - ETA: 43s - loss: 0.6723 - regression_loss: 0.6032 - classification_loss: 0.0691 387/500 [======================>.......] - ETA: 43s - loss: 0.6729 - regression_loss: 0.6038 - classification_loss: 0.0690 388/500 [======================>.......] 
- ETA: 42s - loss: 0.6732 - regression_loss: 0.6042 - classification_loss: 0.0690 389/500 [======================>.......] - ETA: 42s - loss: 0.6728 - regression_loss: 0.6038 - classification_loss: 0.0690 390/500 [======================>.......] - ETA: 41s - loss: 0.6722 - regression_loss: 0.6033 - classification_loss: 0.0689 391/500 [======================>.......] - ETA: 41s - loss: 0.6714 - regression_loss: 0.6026 - classification_loss: 0.0688 392/500 [======================>.......] - ETA: 41s - loss: 0.6705 - regression_loss: 0.6017 - classification_loss: 0.0688 393/500 [======================>.......] - ETA: 40s - loss: 0.6708 - regression_loss: 0.6020 - classification_loss: 0.0687 394/500 [======================>.......] - ETA: 40s - loss: 0.6705 - regression_loss: 0.6018 - classification_loss: 0.0686 395/500 [======================>.......] - ETA: 40s - loss: 0.6699 - regression_loss: 0.6013 - classification_loss: 0.0686 396/500 [======================>.......] - ETA: 39s - loss: 0.6702 - regression_loss: 0.6017 - classification_loss: 0.0685 397/500 [======================>.......] - ETA: 39s - loss: 0.6706 - regression_loss: 0.6021 - classification_loss: 0.0685 398/500 [======================>.......] - ETA: 38s - loss: 0.6707 - regression_loss: 0.6021 - classification_loss: 0.0686 399/500 [======================>.......] - ETA: 38s - loss: 0.6702 - regression_loss: 0.6018 - classification_loss: 0.0685 400/500 [=======================>......] - ETA: 38s - loss: 0.6699 - regression_loss: 0.6016 - classification_loss: 0.0684 401/500 [=======================>......] - ETA: 37s - loss: 0.6694 - regression_loss: 0.6011 - classification_loss: 0.0683 402/500 [=======================>......] - ETA: 37s - loss: 0.6699 - regression_loss: 0.6016 - classification_loss: 0.0684 403/500 [=======================>......] - ETA: 37s - loss: 0.6696 - regression_loss: 0.6013 - classification_loss: 0.0683 404/500 [=======================>......] 
[per-batch progress updates for steps 405-499/500 omitted; loss held near 0.667 throughout]
500/500 [==============================] - 191s 382ms/step - loss: 0.6663 - regression_loss: 0.5986 - classification_loss: 0.0677
1172 instances of class plum with average precision: 0.7056
mAP: 0.7056
Epoch 00015: saving model to ./training/snapshots/resnet101_pascal_15.h5
Epoch 16/20
[per-batch progress updates for steps 1-238/500 omitted; loss settled near 0.661 by step 238]
- ETA: 1:40 - loss: 0.6602 - regression_loss: 0.5931 - classification_loss: 0.0670 239/500 [=============>................] - ETA: 1:40 - loss: 0.6607 - regression_loss: 0.5936 - classification_loss: 0.0671 240/500 [=============>................] - ETA: 1:39 - loss: 0.6609 - regression_loss: 0.5938 - classification_loss: 0.0671 241/500 [=============>................] - ETA: 1:39 - loss: 0.6596 - regression_loss: 0.5927 - classification_loss: 0.0669 242/500 [=============>................] - ETA: 1:38 - loss: 0.6587 - regression_loss: 0.5918 - classification_loss: 0.0669 243/500 [=============>................] - ETA: 1:38 - loss: 0.6585 - regression_loss: 0.5916 - classification_loss: 0.0669 244/500 [=============>................] - ETA: 1:38 - loss: 0.6598 - regression_loss: 0.5927 - classification_loss: 0.0671 245/500 [=============>................] - ETA: 1:37 - loss: 0.6591 - regression_loss: 0.5921 - classification_loss: 0.0670 246/500 [=============>................] - ETA: 1:37 - loss: 0.6610 - regression_loss: 0.5936 - classification_loss: 0.0674 247/500 [=============>................] - ETA: 1:36 - loss: 0.6610 - regression_loss: 0.5937 - classification_loss: 0.0673 248/500 [=============>................] - ETA: 1:36 - loss: 0.6607 - regression_loss: 0.5934 - classification_loss: 0.0673 249/500 [=============>................] - ETA: 1:36 - loss: 0.6597 - regression_loss: 0.5925 - classification_loss: 0.0671 250/500 [==============>...............] - ETA: 1:35 - loss: 0.6608 - regression_loss: 0.5935 - classification_loss: 0.0673 251/500 [==============>...............] - ETA: 1:35 - loss: 0.6610 - regression_loss: 0.5938 - classification_loss: 0.0673 252/500 [==============>...............] - ETA: 1:34 - loss: 0.6599 - regression_loss: 0.5928 - classification_loss: 0.0671 253/500 [==============>...............] - ETA: 1:34 - loss: 0.6600 - regression_loss: 0.5928 - classification_loss: 0.0672 254/500 [==============>...............] 
- ETA: 1:34 - loss: 0.6604 - regression_loss: 0.5931 - classification_loss: 0.0672 255/500 [==============>...............] - ETA: 1:33 - loss: 0.6603 - regression_loss: 0.5931 - classification_loss: 0.0672 256/500 [==============>...............] - ETA: 1:33 - loss: 0.6600 - regression_loss: 0.5928 - classification_loss: 0.0672 257/500 [==============>...............] - ETA: 1:33 - loss: 0.6592 - regression_loss: 0.5922 - classification_loss: 0.0670 258/500 [==============>...............] - ETA: 1:32 - loss: 0.6572 - regression_loss: 0.5904 - classification_loss: 0.0668 259/500 [==============>...............] - ETA: 1:32 - loss: 0.6568 - regression_loss: 0.5900 - classification_loss: 0.0668 260/500 [==============>...............] - ETA: 1:31 - loss: 0.6575 - regression_loss: 0.5906 - classification_loss: 0.0669 261/500 [==============>...............] - ETA: 1:31 - loss: 0.6573 - regression_loss: 0.5904 - classification_loss: 0.0669 262/500 [==============>...............] - ETA: 1:31 - loss: 0.6589 - regression_loss: 0.5920 - classification_loss: 0.0669 263/500 [==============>...............] - ETA: 1:30 - loss: 0.6594 - regression_loss: 0.5926 - classification_loss: 0.0669 264/500 [==============>...............] - ETA: 1:30 - loss: 0.6613 - regression_loss: 0.5942 - classification_loss: 0.0670 265/500 [==============>...............] - ETA: 1:29 - loss: 0.6627 - regression_loss: 0.5957 - classification_loss: 0.0670 266/500 [==============>...............] - ETA: 1:29 - loss: 0.6629 - regression_loss: 0.5959 - classification_loss: 0.0670 267/500 [===============>..............] - ETA: 1:29 - loss: 0.6623 - regression_loss: 0.5954 - classification_loss: 0.0669 268/500 [===============>..............] - ETA: 1:28 - loss: 0.6623 - regression_loss: 0.5954 - classification_loss: 0.0669 269/500 [===============>..............] - ETA: 1:28 - loss: 0.6650 - regression_loss: 0.5978 - classification_loss: 0.0672 270/500 [===============>..............] 
- ETA: 1:28 - loss: 0.6693 - regression_loss: 0.6016 - classification_loss: 0.0678 271/500 [===============>..............] - ETA: 1:27 - loss: 0.6703 - regression_loss: 0.6024 - classification_loss: 0.0679 272/500 [===============>..............] - ETA: 1:27 - loss: 0.6705 - regression_loss: 0.6027 - classification_loss: 0.0678 273/500 [===============>..............] - ETA: 1:26 - loss: 0.6713 - regression_loss: 0.6034 - classification_loss: 0.0680 274/500 [===============>..............] - ETA: 1:26 - loss: 0.6702 - regression_loss: 0.6024 - classification_loss: 0.0678 275/500 [===============>..............] - ETA: 1:26 - loss: 0.6715 - regression_loss: 0.6036 - classification_loss: 0.0679 276/500 [===============>..............] - ETA: 1:25 - loss: 0.6726 - regression_loss: 0.6046 - classification_loss: 0.0680 277/500 [===============>..............] - ETA: 1:25 - loss: 0.6729 - regression_loss: 0.6049 - classification_loss: 0.0680 278/500 [===============>..............] - ETA: 1:24 - loss: 0.6741 - regression_loss: 0.6060 - classification_loss: 0.0681 279/500 [===============>..............] - ETA: 1:24 - loss: 0.6739 - regression_loss: 0.6058 - classification_loss: 0.0681 280/500 [===============>..............] - ETA: 1:24 - loss: 0.6727 - regression_loss: 0.6047 - classification_loss: 0.0680 281/500 [===============>..............] - ETA: 1:23 - loss: 0.6717 - regression_loss: 0.6038 - classification_loss: 0.0679 282/500 [===============>..............] - ETA: 1:23 - loss: 0.6716 - regression_loss: 0.6037 - classification_loss: 0.0679 283/500 [===============>..............] - ETA: 1:23 - loss: 0.6738 - regression_loss: 0.6057 - classification_loss: 0.0680 284/500 [================>.............] - ETA: 1:22 - loss: 0.6740 - regression_loss: 0.6060 - classification_loss: 0.0680 285/500 [================>.............] - ETA: 1:22 - loss: 0.6753 - regression_loss: 0.6072 - classification_loss: 0.0681 286/500 [================>.............] 
- ETA: 1:21 - loss: 0.6768 - regression_loss: 0.6086 - classification_loss: 0.0682 287/500 [================>.............] - ETA: 1:21 - loss: 0.6776 - regression_loss: 0.6094 - classification_loss: 0.0682 288/500 [================>.............] - ETA: 1:21 - loss: 0.6784 - regression_loss: 0.6100 - classification_loss: 0.0684 289/500 [================>.............] - ETA: 1:20 - loss: 0.6779 - regression_loss: 0.6095 - classification_loss: 0.0683 290/500 [================>.............] - ETA: 1:20 - loss: 0.6776 - regression_loss: 0.6093 - classification_loss: 0.0683 291/500 [================>.............] - ETA: 1:20 - loss: 0.6772 - regression_loss: 0.6090 - classification_loss: 0.0682 292/500 [================>.............] - ETA: 1:19 - loss: 0.6774 - regression_loss: 0.6091 - classification_loss: 0.0682 293/500 [================>.............] - ETA: 1:19 - loss: 0.6767 - regression_loss: 0.6086 - classification_loss: 0.0681 294/500 [================>.............] - ETA: 1:18 - loss: 0.6768 - regression_loss: 0.6087 - classification_loss: 0.0681 295/500 [================>.............] - ETA: 1:18 - loss: 0.6760 - regression_loss: 0.6080 - classification_loss: 0.0680 296/500 [================>.............] - ETA: 1:18 - loss: 0.6757 - regression_loss: 0.6078 - classification_loss: 0.0679 297/500 [================>.............] - ETA: 1:17 - loss: 0.6745 - regression_loss: 0.6066 - classification_loss: 0.0679 298/500 [================>.............] - ETA: 1:17 - loss: 0.6747 - regression_loss: 0.6068 - classification_loss: 0.0679 299/500 [================>.............] - ETA: 1:16 - loss: 0.6750 - regression_loss: 0.6071 - classification_loss: 0.0679 300/500 [=================>............] - ETA: 1:16 - loss: 0.6759 - regression_loss: 0.6080 - classification_loss: 0.0680 301/500 [=================>............] - ETA: 1:16 - loss: 0.6770 - regression_loss: 0.6090 - classification_loss: 0.0680 302/500 [=================>............] 
- ETA: 1:15 - loss: 0.6770 - regression_loss: 0.6091 - classification_loss: 0.0680 303/500 [=================>............] - ETA: 1:15 - loss: 0.6788 - regression_loss: 0.6107 - classification_loss: 0.0681 304/500 [=================>............] - ETA: 1:14 - loss: 0.6794 - regression_loss: 0.6113 - classification_loss: 0.0681 305/500 [=================>............] - ETA: 1:14 - loss: 0.6788 - regression_loss: 0.6106 - classification_loss: 0.0681 306/500 [=================>............] - ETA: 1:14 - loss: 0.6784 - regression_loss: 0.6103 - classification_loss: 0.0681 307/500 [=================>............] - ETA: 1:13 - loss: 0.6774 - regression_loss: 0.6094 - classification_loss: 0.0679 308/500 [=================>............] - ETA: 1:13 - loss: 0.6786 - regression_loss: 0.6106 - classification_loss: 0.0680 309/500 [=================>............] - ETA: 1:13 - loss: 0.6792 - regression_loss: 0.6111 - classification_loss: 0.0681 310/500 [=================>............] - ETA: 1:12 - loss: 0.6799 - regression_loss: 0.6118 - classification_loss: 0.0681 311/500 [=================>............] - ETA: 1:12 - loss: 0.6788 - regression_loss: 0.6109 - classification_loss: 0.0679 312/500 [=================>............] - ETA: 1:11 - loss: 0.6787 - regression_loss: 0.6109 - classification_loss: 0.0679 313/500 [=================>............] - ETA: 1:11 - loss: 0.6801 - regression_loss: 0.6122 - classification_loss: 0.0680 314/500 [=================>............] - ETA: 1:11 - loss: 0.6790 - regression_loss: 0.6112 - classification_loss: 0.0678 315/500 [=================>............] - ETA: 1:10 - loss: 0.6802 - regression_loss: 0.6123 - classification_loss: 0.0679 316/500 [=================>............] - ETA: 1:10 - loss: 0.6791 - regression_loss: 0.6113 - classification_loss: 0.0677 317/500 [==================>...........] - ETA: 1:10 - loss: 0.6779 - regression_loss: 0.6103 - classification_loss: 0.0676 318/500 [==================>...........] 
- ETA: 1:09 - loss: 0.6785 - regression_loss: 0.6108 - classification_loss: 0.0677 319/500 [==================>...........] - ETA: 1:09 - loss: 0.6788 - regression_loss: 0.6110 - classification_loss: 0.0678 320/500 [==================>...........] - ETA: 1:08 - loss: 0.6788 - regression_loss: 0.6111 - classification_loss: 0.0677 321/500 [==================>...........] - ETA: 1:08 - loss: 0.6785 - regression_loss: 0.6108 - classification_loss: 0.0677 322/500 [==================>...........] - ETA: 1:08 - loss: 0.6780 - regression_loss: 0.6104 - classification_loss: 0.0676 323/500 [==================>...........] - ETA: 1:07 - loss: 0.6783 - regression_loss: 0.6107 - classification_loss: 0.0677 324/500 [==================>...........] - ETA: 1:07 - loss: 0.6780 - regression_loss: 0.6104 - classification_loss: 0.0676 325/500 [==================>...........] - ETA: 1:06 - loss: 0.6769 - regression_loss: 0.6094 - classification_loss: 0.0675 326/500 [==================>...........] - ETA: 1:06 - loss: 0.6757 - regression_loss: 0.6084 - classification_loss: 0.0673 327/500 [==================>...........] - ETA: 1:06 - loss: 0.6751 - regression_loss: 0.6078 - classification_loss: 0.0673 328/500 [==================>...........] - ETA: 1:05 - loss: 0.6746 - regression_loss: 0.6074 - classification_loss: 0.0672 329/500 [==================>...........] - ETA: 1:05 - loss: 0.6742 - regression_loss: 0.6071 - classification_loss: 0.0672 330/500 [==================>...........] - ETA: 1:05 - loss: 0.6746 - regression_loss: 0.6074 - classification_loss: 0.0672 331/500 [==================>...........] - ETA: 1:04 - loss: 0.6738 - regression_loss: 0.6066 - classification_loss: 0.0671 332/500 [==================>...........] - ETA: 1:04 - loss: 0.6733 - regression_loss: 0.6063 - classification_loss: 0.0670 333/500 [==================>...........] - ETA: 1:03 - loss: 0.6724 - regression_loss: 0.6054 - classification_loss: 0.0670 334/500 [===================>..........] 
- ETA: 1:03 - loss: 0.6720 - regression_loss: 0.6050 - classification_loss: 0.0670 335/500 [===================>..........] - ETA: 1:03 - loss: 0.6714 - regression_loss: 0.6045 - classification_loss: 0.0670 336/500 [===================>..........] - ETA: 1:02 - loss: 0.6718 - regression_loss: 0.6049 - classification_loss: 0.0669 337/500 [===================>..........] - ETA: 1:02 - loss: 0.6711 - regression_loss: 0.6042 - classification_loss: 0.0669 338/500 [===================>..........] - ETA: 1:01 - loss: 0.6720 - regression_loss: 0.6050 - classification_loss: 0.0670 339/500 [===================>..........] - ETA: 1:01 - loss: 0.6711 - regression_loss: 0.6041 - classification_loss: 0.0669 340/500 [===================>..........] - ETA: 1:01 - loss: 0.6710 - regression_loss: 0.6041 - classification_loss: 0.0669 341/500 [===================>..........] - ETA: 1:00 - loss: 0.6713 - regression_loss: 0.6044 - classification_loss: 0.0669 342/500 [===================>..........] - ETA: 1:00 - loss: 0.6716 - regression_loss: 0.6046 - classification_loss: 0.0670 343/500 [===================>..........] - ETA: 1:00 - loss: 0.6706 - regression_loss: 0.6038 - classification_loss: 0.0669 344/500 [===================>..........] - ETA: 59s - loss: 0.6707 - regression_loss: 0.6038 - classification_loss: 0.0669  345/500 [===================>..........] - ETA: 59s - loss: 0.6701 - regression_loss: 0.6033 - classification_loss: 0.0668 346/500 [===================>..........] - ETA: 58s - loss: 0.6694 - regression_loss: 0.6027 - classification_loss: 0.0667 347/500 [===================>..........] - ETA: 58s - loss: 0.6694 - regression_loss: 0.6027 - classification_loss: 0.0667 348/500 [===================>..........] - ETA: 58s - loss: 0.6685 - regression_loss: 0.6019 - classification_loss: 0.0665 349/500 [===================>..........] - ETA: 57s - loss: 0.6687 - regression_loss: 0.6021 - classification_loss: 0.0666 350/500 [====================>.........] 
- ETA: 57s - loss: 0.6677 - regression_loss: 0.6012 - classification_loss: 0.0665 351/500 [====================>.........] - ETA: 56s - loss: 0.6675 - regression_loss: 0.6011 - classification_loss: 0.0665 352/500 [====================>.........] - ETA: 56s - loss: 0.6688 - regression_loss: 0.6022 - classification_loss: 0.0666 353/500 [====================>.........] - ETA: 56s - loss: 0.6697 - regression_loss: 0.6030 - classification_loss: 0.0667 354/500 [====================>.........] - ETA: 55s - loss: 0.6699 - regression_loss: 0.6032 - classification_loss: 0.0666 355/500 [====================>.........] - ETA: 55s - loss: 0.6694 - regression_loss: 0.6027 - classification_loss: 0.0667 356/500 [====================>.........] - ETA: 55s - loss: 0.6692 - regression_loss: 0.6026 - classification_loss: 0.0666 357/500 [====================>.........] - ETA: 54s - loss: 0.6712 - regression_loss: 0.6044 - classification_loss: 0.0668 358/500 [====================>.........] - ETA: 54s - loss: 0.6716 - regression_loss: 0.6047 - classification_loss: 0.0668 359/500 [====================>.........] - ETA: 53s - loss: 0.6703 - regression_loss: 0.6036 - classification_loss: 0.0667 360/500 [====================>.........] - ETA: 53s - loss: 0.6703 - regression_loss: 0.6036 - classification_loss: 0.0667 361/500 [====================>.........] - ETA: 53s - loss: 0.6706 - regression_loss: 0.6039 - classification_loss: 0.0668 362/500 [====================>.........] - ETA: 52s - loss: 0.6703 - regression_loss: 0.6036 - classification_loss: 0.0667 363/500 [====================>.........] - ETA: 52s - loss: 0.6706 - regression_loss: 0.6038 - classification_loss: 0.0668 364/500 [====================>.........] - ETA: 52s - loss: 0.6709 - regression_loss: 0.6041 - classification_loss: 0.0668 365/500 [====================>.........] - ETA: 51s - loss: 0.6716 - regression_loss: 0.6046 - classification_loss: 0.0670 366/500 [====================>.........] 
- ETA: 51s - loss: 0.6706 - regression_loss: 0.6038 - classification_loss: 0.0668 367/500 [=====================>........] - ETA: 50s - loss: 0.6711 - regression_loss: 0.6042 - classification_loss: 0.0669 368/500 [=====================>........] - ETA: 50s - loss: 0.6699 - regression_loss: 0.6032 - classification_loss: 0.0667 369/500 [=====================>........] - ETA: 50s - loss: 0.6709 - regression_loss: 0.6039 - classification_loss: 0.0669 370/500 [=====================>........] - ETA: 49s - loss: 0.6707 - regression_loss: 0.6039 - classification_loss: 0.0668 371/500 [=====================>........] - ETA: 49s - loss: 0.6695 - regression_loss: 0.6028 - classification_loss: 0.0667 372/500 [=====================>........] - ETA: 48s - loss: 0.6698 - regression_loss: 0.6031 - classification_loss: 0.0668 373/500 [=====================>........] - ETA: 48s - loss: 0.6701 - regression_loss: 0.6033 - classification_loss: 0.0669 374/500 [=====================>........] - ETA: 48s - loss: 0.6709 - regression_loss: 0.6039 - classification_loss: 0.0670 375/500 [=====================>........] - ETA: 47s - loss: 0.6704 - regression_loss: 0.6035 - classification_loss: 0.0669 376/500 [=====================>........] - ETA: 47s - loss: 0.6708 - regression_loss: 0.6040 - classification_loss: 0.0668 377/500 [=====================>........] - ETA: 47s - loss: 0.6708 - regression_loss: 0.6040 - classification_loss: 0.0668 378/500 [=====================>........] - ETA: 46s - loss: 0.6695 - regression_loss: 0.6028 - classification_loss: 0.0667 379/500 [=====================>........] - ETA: 46s - loss: 0.6691 - regression_loss: 0.6025 - classification_loss: 0.0666 380/500 [=====================>........] - ETA: 45s - loss: 0.6685 - regression_loss: 0.6019 - classification_loss: 0.0666 381/500 [=====================>........] - ETA: 45s - loss: 0.6694 - regression_loss: 0.6027 - classification_loss: 0.0667 382/500 [=====================>........] 
- ETA: 45s - loss: 0.6693 - regression_loss: 0.6026 - classification_loss: 0.0667 383/500 [=====================>........] - ETA: 44s - loss: 0.6689 - regression_loss: 0.6022 - classification_loss: 0.0666 384/500 [======================>.......] - ETA: 44s - loss: 0.6696 - regression_loss: 0.6029 - classification_loss: 0.0667 385/500 [======================>.......] - ETA: 44s - loss: 0.6698 - regression_loss: 0.6030 - classification_loss: 0.0668 386/500 [======================>.......] - ETA: 43s - loss: 0.6692 - regression_loss: 0.6025 - classification_loss: 0.0668 387/500 [======================>.......] - ETA: 43s - loss: 0.6688 - regression_loss: 0.6021 - classification_loss: 0.0667 388/500 [======================>.......] - ETA: 42s - loss: 0.6686 - regression_loss: 0.6019 - classification_loss: 0.0667 389/500 [======================>.......] - ETA: 42s - loss: 0.6700 - regression_loss: 0.6030 - classification_loss: 0.0670 390/500 [======================>.......] - ETA: 42s - loss: 0.6702 - regression_loss: 0.6032 - classification_loss: 0.0670 391/500 [======================>.......] - ETA: 41s - loss: 0.6705 - regression_loss: 0.6035 - classification_loss: 0.0670 392/500 [======================>.......] - ETA: 41s - loss: 0.6700 - regression_loss: 0.6030 - classification_loss: 0.0670 393/500 [======================>.......] - ETA: 40s - loss: 0.6698 - regression_loss: 0.6027 - classification_loss: 0.0670 394/500 [======================>.......] - ETA: 40s - loss: 0.6702 - regression_loss: 0.6031 - classification_loss: 0.0671 395/500 [======================>.......] - ETA: 40s - loss: 0.6704 - regression_loss: 0.6033 - classification_loss: 0.0671 396/500 [======================>.......] - ETA: 39s - loss: 0.6703 - regression_loss: 0.6032 - classification_loss: 0.0671 397/500 [======================>.......] - ETA: 39s - loss: 0.6698 - regression_loss: 0.6028 - classification_loss: 0.0670 398/500 [======================>.......] 
- ETA: 39s - loss: 0.6701 - regression_loss: 0.6031 - classification_loss: 0.0671 399/500 [======================>.......] - ETA: 38s - loss: 0.6693 - regression_loss: 0.6023 - classification_loss: 0.0670 400/500 [=======================>......] - ETA: 38s - loss: 0.6694 - regression_loss: 0.6024 - classification_loss: 0.0670 401/500 [=======================>......] - ETA: 37s - loss: 0.6687 - regression_loss: 0.6018 - classification_loss: 0.0669 402/500 [=======================>......] - ETA: 37s - loss: 0.6682 - regression_loss: 0.6014 - classification_loss: 0.0669 403/500 [=======================>......] - ETA: 37s - loss: 0.6685 - regression_loss: 0.6016 - classification_loss: 0.0669 404/500 [=======================>......] - ETA: 36s - loss: 0.6679 - regression_loss: 0.6011 - classification_loss: 0.0668 405/500 [=======================>......] - ETA: 36s - loss: 0.6689 - regression_loss: 0.6020 - classification_loss: 0.0669 406/500 [=======================>......] - ETA: 35s - loss: 0.6689 - regression_loss: 0.6020 - classification_loss: 0.0669 407/500 [=======================>......] - ETA: 35s - loss: 0.6688 - regression_loss: 0.6019 - classification_loss: 0.0669 408/500 [=======================>......] - ETA: 35s - loss: 0.6681 - regression_loss: 0.6013 - classification_loss: 0.0668 409/500 [=======================>......] - ETA: 34s - loss: 0.6673 - regression_loss: 0.6005 - classification_loss: 0.0667 410/500 [=======================>......] - ETA: 34s - loss: 0.6676 - regression_loss: 0.6008 - classification_loss: 0.0668 411/500 [=======================>......] - ETA: 34s - loss: 0.6672 - regression_loss: 0.6005 - classification_loss: 0.0667 412/500 [=======================>......] - ETA: 33s - loss: 0.6669 - regression_loss: 0.6001 - classification_loss: 0.0667 413/500 [=======================>......] - ETA: 33s - loss: 0.6666 - regression_loss: 0.5999 - classification_loss: 0.0667 414/500 [=======================>......] 
- ETA: 32s - loss: 0.6672 - regression_loss: 0.6004 - classification_loss: 0.0668 415/500 [=======================>......] - ETA: 32s - loss: 0.6673 - regression_loss: 0.6005 - classification_loss: 0.0668 416/500 [=======================>......] - ETA: 32s - loss: 0.6672 - regression_loss: 0.6004 - classification_loss: 0.0668 417/500 [========================>.....] - ETA: 31s - loss: 0.6667 - regression_loss: 0.6000 - classification_loss: 0.0667 418/500 [========================>.....] - ETA: 31s - loss: 0.6663 - regression_loss: 0.5996 - classification_loss: 0.0667 419/500 [========================>.....] - ETA: 31s - loss: 0.6664 - regression_loss: 0.5997 - classification_loss: 0.0667 420/500 [========================>.....] - ETA: 30s - loss: 0.6667 - regression_loss: 0.5999 - classification_loss: 0.0668 421/500 [========================>.....] - ETA: 30s - loss: 0.6667 - regression_loss: 0.5999 - classification_loss: 0.0668 422/500 [========================>.....] - ETA: 29s - loss: 0.6665 - regression_loss: 0.5997 - classification_loss: 0.0668 423/500 [========================>.....] - ETA: 29s - loss: 0.6657 - regression_loss: 0.5990 - classification_loss: 0.0667 424/500 [========================>.....] - ETA: 29s - loss: 0.6651 - regression_loss: 0.5985 - classification_loss: 0.0666 425/500 [========================>.....] - ETA: 28s - loss: 0.6648 - regression_loss: 0.5982 - classification_loss: 0.0666 426/500 [========================>.....] - ETA: 28s - loss: 0.6648 - regression_loss: 0.5982 - classification_loss: 0.0666 427/500 [========================>.....] - ETA: 27s - loss: 0.6647 - regression_loss: 0.5981 - classification_loss: 0.0665 428/500 [========================>.....] - ETA: 27s - loss: 0.6643 - regression_loss: 0.5978 - classification_loss: 0.0665 429/500 [========================>.....] - ETA: 27s - loss: 0.6641 - regression_loss: 0.5975 - classification_loss: 0.0666 430/500 [========================>.....] 
- ETA: 26s - loss: 0.6645 - regression_loss: 0.5979 - classification_loss: 0.0666 431/500 [========================>.....] - ETA: 26s - loss: 0.6646 - regression_loss: 0.5981 - classification_loss: 0.0666 432/500 [========================>.....] - ETA: 26s - loss: 0.6646 - regression_loss: 0.5980 - classification_loss: 0.0666 433/500 [========================>.....] - ETA: 25s - loss: 0.6641 - regression_loss: 0.5976 - classification_loss: 0.0665 434/500 [=========================>....] - ETA: 25s - loss: 0.6637 - regression_loss: 0.5972 - classification_loss: 0.0665 435/500 [=========================>....] - ETA: 24s - loss: 0.6631 - regression_loss: 0.5966 - classification_loss: 0.0665 436/500 [=========================>....] - ETA: 24s - loss: 0.6636 - regression_loss: 0.5971 - classification_loss: 0.0665 437/500 [=========================>....] - ETA: 24s - loss: 0.6632 - regression_loss: 0.5967 - classification_loss: 0.0665 438/500 [=========================>....] - ETA: 23s - loss: 0.6637 - regression_loss: 0.5972 - classification_loss: 0.0665 439/500 [=========================>....] - ETA: 23s - loss: 0.6632 - regression_loss: 0.5968 - classification_loss: 0.0664 440/500 [=========================>....] - ETA: 22s - loss: 0.6629 - regression_loss: 0.5965 - classification_loss: 0.0664 441/500 [=========================>....] - ETA: 22s - loss: 0.6632 - regression_loss: 0.5968 - classification_loss: 0.0664 442/500 [=========================>....] - ETA: 22s - loss: 0.6631 - regression_loss: 0.5967 - classification_loss: 0.0664 443/500 [=========================>....] - ETA: 21s - loss: 0.6629 - regression_loss: 0.5965 - classification_loss: 0.0664 444/500 [=========================>....] - ETA: 21s - loss: 0.6625 - regression_loss: 0.5961 - classification_loss: 0.0663 445/500 [=========================>....] - ETA: 21s - loss: 0.6634 - regression_loss: 0.5969 - classification_loss: 0.0665 446/500 [=========================>....] 
[per-batch progress updates for epoch 16 omitted]
500/500 [==============================] - 191s 383ms/step - loss: 0.6591 - regression_loss: 0.5932 - classification_loss: 0.0659
1172 instances of class plum with average precision: 0.7110
mAP: 0.7110
Epoch 00016: saving model to ./training/snapshots/resnet101_pascal_16.h5
Epoch 17/20
[per-batch progress updates omitted; last logged batch: 280/500 - loss: 0.6721 - regression_loss: 0.6047 - classification_loss: 0.0675]
- ETA: 1:23 - loss: 0.6722 - regression_loss: 0.6047 - classification_loss: 0.0675 282/500 [===============>..............] - ETA: 1:23 - loss: 0.6724 - regression_loss: 0.6050 - classification_loss: 0.0674 283/500 [===============>..............] - ETA: 1:23 - loss: 0.6714 - regression_loss: 0.6042 - classification_loss: 0.0672 284/500 [================>.............] - ETA: 1:22 - loss: 0.6718 - regression_loss: 0.6046 - classification_loss: 0.0672 285/500 [================>.............] - ETA: 1:22 - loss: 0.6737 - regression_loss: 0.6063 - classification_loss: 0.0674 286/500 [================>.............] - ETA: 1:21 - loss: 0.6727 - regression_loss: 0.6054 - classification_loss: 0.0673 287/500 [================>.............] - ETA: 1:21 - loss: 0.6737 - regression_loss: 0.6064 - classification_loss: 0.0673 288/500 [================>.............] - ETA: 1:21 - loss: 0.6739 - regression_loss: 0.6065 - classification_loss: 0.0675 289/500 [================>.............] - ETA: 1:20 - loss: 0.6728 - regression_loss: 0.6055 - classification_loss: 0.0673 290/500 [================>.............] - ETA: 1:20 - loss: 0.6721 - regression_loss: 0.6049 - classification_loss: 0.0672 291/500 [================>.............] - ETA: 1:20 - loss: 0.6724 - regression_loss: 0.6051 - classification_loss: 0.0672 292/500 [================>.............] - ETA: 1:19 - loss: 0.6712 - regression_loss: 0.6042 - classification_loss: 0.0671 293/500 [================>.............] - ETA: 1:19 - loss: 0.6705 - regression_loss: 0.6036 - classification_loss: 0.0670 294/500 [================>.............] - ETA: 1:18 - loss: 0.6693 - regression_loss: 0.6025 - classification_loss: 0.0668 295/500 [================>.............] - ETA: 1:18 - loss: 0.6681 - regression_loss: 0.6014 - classification_loss: 0.0667 296/500 [================>.............] - ETA: 1:18 - loss: 0.6680 - regression_loss: 0.6013 - classification_loss: 0.0667 297/500 [================>.............] 
- ETA: 1:17 - loss: 0.6680 - regression_loss: 0.6014 - classification_loss: 0.0667 298/500 [================>.............] - ETA: 1:17 - loss: 0.6681 - regression_loss: 0.6014 - classification_loss: 0.0667 299/500 [================>.............] - ETA: 1:16 - loss: 0.6680 - regression_loss: 0.6013 - classification_loss: 0.0667 300/500 [=================>............] - ETA: 1:16 - loss: 0.6668 - regression_loss: 0.6002 - classification_loss: 0.0666 301/500 [=================>............] - ETA: 1:16 - loss: 0.6660 - regression_loss: 0.5995 - classification_loss: 0.0665 302/500 [=================>............] - ETA: 1:15 - loss: 0.6655 - regression_loss: 0.5991 - classification_loss: 0.0664 303/500 [=================>............] - ETA: 1:15 - loss: 0.6649 - regression_loss: 0.5986 - classification_loss: 0.0663 304/500 [=================>............] - ETA: 1:15 - loss: 0.6652 - regression_loss: 0.5989 - classification_loss: 0.0663 305/500 [=================>............] - ETA: 1:14 - loss: 0.6649 - regression_loss: 0.5986 - classification_loss: 0.0662 306/500 [=================>............] - ETA: 1:14 - loss: 0.6645 - regression_loss: 0.5983 - classification_loss: 0.0662 307/500 [=================>............] - ETA: 1:13 - loss: 0.6645 - regression_loss: 0.5982 - classification_loss: 0.0662 308/500 [=================>............] - ETA: 1:13 - loss: 0.6643 - regression_loss: 0.5980 - classification_loss: 0.0662 309/500 [=================>............] - ETA: 1:13 - loss: 0.6640 - regression_loss: 0.5978 - classification_loss: 0.0662 310/500 [=================>............] - ETA: 1:12 - loss: 0.6640 - regression_loss: 0.5979 - classification_loss: 0.0661 311/500 [=================>............] - ETA: 1:12 - loss: 0.6639 - regression_loss: 0.5979 - classification_loss: 0.0661 312/500 [=================>............] - ETA: 1:11 - loss: 0.6637 - regression_loss: 0.5977 - classification_loss: 0.0660 313/500 [=================>............] 
- ETA: 1:11 - loss: 0.6633 - regression_loss: 0.5973 - classification_loss: 0.0660 314/500 [=================>............] - ETA: 1:11 - loss: 0.6639 - regression_loss: 0.5978 - classification_loss: 0.0661 315/500 [=================>............] - ETA: 1:10 - loss: 0.6639 - regression_loss: 0.5977 - classification_loss: 0.0662 316/500 [=================>............] - ETA: 1:10 - loss: 0.6636 - regression_loss: 0.5975 - classification_loss: 0.0661 317/500 [==================>...........] - ETA: 1:10 - loss: 0.6638 - regression_loss: 0.5977 - classification_loss: 0.0661 318/500 [==================>...........] - ETA: 1:09 - loss: 0.6634 - regression_loss: 0.5974 - classification_loss: 0.0660 319/500 [==================>...........] - ETA: 1:09 - loss: 0.6634 - regression_loss: 0.5974 - classification_loss: 0.0660 320/500 [==================>...........] - ETA: 1:08 - loss: 0.6627 - regression_loss: 0.5968 - classification_loss: 0.0659 321/500 [==================>...........] - ETA: 1:08 - loss: 0.6627 - regression_loss: 0.5967 - classification_loss: 0.0659 322/500 [==================>...........] - ETA: 1:08 - loss: 0.6616 - regression_loss: 0.5957 - classification_loss: 0.0658 323/500 [==================>...........] - ETA: 1:07 - loss: 0.6607 - regression_loss: 0.5950 - classification_loss: 0.0658 324/500 [==================>...........] - ETA: 1:07 - loss: 0.6606 - regression_loss: 0.5949 - classification_loss: 0.0657 325/500 [==================>...........] - ETA: 1:06 - loss: 0.6601 - regression_loss: 0.5944 - classification_loss: 0.0657 326/500 [==================>...........] - ETA: 1:06 - loss: 0.6595 - regression_loss: 0.5939 - classification_loss: 0.0656 327/500 [==================>...........] - ETA: 1:06 - loss: 0.6597 - regression_loss: 0.5940 - classification_loss: 0.0657 328/500 [==================>...........] - ETA: 1:05 - loss: 0.6593 - regression_loss: 0.5937 - classification_loss: 0.0657 329/500 [==================>...........] 
- ETA: 1:05 - loss: 0.6590 - regression_loss: 0.5933 - classification_loss: 0.0656 330/500 [==================>...........] - ETA: 1:05 - loss: 0.6589 - regression_loss: 0.5933 - classification_loss: 0.0656 331/500 [==================>...........] - ETA: 1:04 - loss: 0.6597 - regression_loss: 0.5941 - classification_loss: 0.0656 332/500 [==================>...........] - ETA: 1:04 - loss: 0.6603 - regression_loss: 0.5945 - classification_loss: 0.0658 333/500 [==================>...........] - ETA: 1:03 - loss: 0.6594 - regression_loss: 0.5937 - classification_loss: 0.0656 334/500 [===================>..........] - ETA: 1:03 - loss: 0.6591 - regression_loss: 0.5935 - classification_loss: 0.0656 335/500 [===================>..........] - ETA: 1:03 - loss: 0.6589 - regression_loss: 0.5934 - classification_loss: 0.0656 336/500 [===================>..........] - ETA: 1:02 - loss: 0.6587 - regression_loss: 0.5932 - classification_loss: 0.0655 337/500 [===================>..........] - ETA: 1:02 - loss: 0.6577 - regression_loss: 0.5923 - classification_loss: 0.0653 338/500 [===================>..........] - ETA: 1:01 - loss: 0.6566 - regression_loss: 0.5914 - classification_loss: 0.0652 339/500 [===================>..........] - ETA: 1:01 - loss: 0.6565 - regression_loss: 0.5914 - classification_loss: 0.0652 340/500 [===================>..........] - ETA: 1:01 - loss: 0.6569 - regression_loss: 0.5918 - classification_loss: 0.0652 341/500 [===================>..........] - ETA: 1:00 - loss: 0.6570 - regression_loss: 0.5919 - classification_loss: 0.0650 342/500 [===================>..........] - ETA: 1:00 - loss: 0.6563 - regression_loss: 0.5913 - classification_loss: 0.0650 343/500 [===================>..........] - ETA: 1:00 - loss: 0.6559 - regression_loss: 0.5911 - classification_loss: 0.0649 344/500 [===================>..........] - ETA: 59s - loss: 0.6561 - regression_loss: 0.5912 - classification_loss: 0.0649  345/500 [===================>..........] 
- ETA: 59s - loss: 0.6556 - regression_loss: 0.5907 - classification_loss: 0.0649 346/500 [===================>..........] - ETA: 58s - loss: 0.6556 - regression_loss: 0.5907 - classification_loss: 0.0648 347/500 [===================>..........] - ETA: 58s - loss: 0.6551 - regression_loss: 0.5903 - classification_loss: 0.0648 348/500 [===================>..........] - ETA: 58s - loss: 0.6552 - regression_loss: 0.5906 - classification_loss: 0.0647 349/500 [===================>..........] - ETA: 57s - loss: 0.6560 - regression_loss: 0.5912 - classification_loss: 0.0648 350/500 [====================>.........] - ETA: 57s - loss: 0.6549 - regression_loss: 0.5903 - classification_loss: 0.0647 351/500 [====================>.........] - ETA: 57s - loss: 0.6545 - regression_loss: 0.5898 - classification_loss: 0.0646 352/500 [====================>.........] - ETA: 56s - loss: 0.6557 - regression_loss: 0.5908 - classification_loss: 0.0649 353/500 [====================>.........] - ETA: 56s - loss: 0.6560 - regression_loss: 0.5911 - classification_loss: 0.0648 354/500 [====================>.........] - ETA: 55s - loss: 0.6561 - regression_loss: 0.5912 - classification_loss: 0.0648 355/500 [====================>.........] - ETA: 55s - loss: 0.6566 - regression_loss: 0.5918 - classification_loss: 0.0648 356/500 [====================>.........] - ETA: 55s - loss: 0.6568 - regression_loss: 0.5921 - classification_loss: 0.0648 357/500 [====================>.........] - ETA: 54s - loss: 0.6557 - regression_loss: 0.5910 - classification_loss: 0.0647 358/500 [====================>.........] - ETA: 54s - loss: 0.6550 - regression_loss: 0.5904 - classification_loss: 0.0646 359/500 [====================>.........] - ETA: 53s - loss: 0.6540 - regression_loss: 0.5894 - classification_loss: 0.0646 360/500 [====================>.........] - ETA: 53s - loss: 0.6541 - regression_loss: 0.5896 - classification_loss: 0.0646 361/500 [====================>.........] 
- ETA: 53s - loss: 0.6550 - regression_loss: 0.5903 - classification_loss: 0.0647 362/500 [====================>.........] - ETA: 52s - loss: 0.6539 - regression_loss: 0.5892 - classification_loss: 0.0646 363/500 [====================>.........] - ETA: 52s - loss: 0.6541 - regression_loss: 0.5894 - classification_loss: 0.0647 364/500 [====================>.........] - ETA: 51s - loss: 0.6535 - regression_loss: 0.5890 - classification_loss: 0.0646 365/500 [====================>.........] - ETA: 51s - loss: 0.6530 - regression_loss: 0.5884 - classification_loss: 0.0646 366/500 [====================>.........] - ETA: 51s - loss: 0.6525 - regression_loss: 0.5880 - classification_loss: 0.0645 367/500 [=====================>........] - ETA: 50s - loss: 0.6519 - regression_loss: 0.5875 - classification_loss: 0.0645 368/500 [=====================>........] - ETA: 50s - loss: 0.6521 - regression_loss: 0.5876 - classification_loss: 0.0645 369/500 [=====================>........] - ETA: 50s - loss: 0.6518 - regression_loss: 0.5873 - classification_loss: 0.0644 370/500 [=====================>........] - ETA: 49s - loss: 0.6525 - regression_loss: 0.5881 - classification_loss: 0.0645 371/500 [=====================>........] - ETA: 49s - loss: 0.6522 - regression_loss: 0.5877 - classification_loss: 0.0645 372/500 [=====================>........] - ETA: 48s - loss: 0.6521 - regression_loss: 0.5877 - classification_loss: 0.0644 373/500 [=====================>........] - ETA: 48s - loss: 0.6512 - regression_loss: 0.5868 - classification_loss: 0.0643 374/500 [=====================>........] - ETA: 48s - loss: 0.6509 - regression_loss: 0.5866 - classification_loss: 0.0643 375/500 [=====================>........] - ETA: 47s - loss: 0.6501 - regression_loss: 0.5859 - classification_loss: 0.0642 376/500 [=====================>........] - ETA: 47s - loss: 0.6511 - regression_loss: 0.5867 - classification_loss: 0.0644 377/500 [=====================>........] 
- ETA: 47s - loss: 0.6505 - regression_loss: 0.5862 - classification_loss: 0.0643 378/500 [=====================>........] - ETA: 46s - loss: 0.6508 - regression_loss: 0.5864 - classification_loss: 0.0643 379/500 [=====================>........] - ETA: 46s - loss: 0.6516 - regression_loss: 0.5872 - classification_loss: 0.0643 380/500 [=====================>........] - ETA: 45s - loss: 0.6523 - regression_loss: 0.5878 - classification_loss: 0.0645 381/500 [=====================>........] - ETA: 45s - loss: 0.6514 - regression_loss: 0.5870 - classification_loss: 0.0644 382/500 [=====================>........] - ETA: 45s - loss: 0.6518 - regression_loss: 0.5873 - classification_loss: 0.0645 383/500 [=====================>........] - ETA: 44s - loss: 0.6517 - regression_loss: 0.5871 - classification_loss: 0.0645 384/500 [======================>.......] - ETA: 44s - loss: 0.6521 - regression_loss: 0.5876 - classification_loss: 0.0646 385/500 [======================>.......] - ETA: 43s - loss: 0.6521 - regression_loss: 0.5875 - classification_loss: 0.0646 386/500 [======================>.......] - ETA: 43s - loss: 0.6524 - regression_loss: 0.5878 - classification_loss: 0.0646 387/500 [======================>.......] - ETA: 43s - loss: 0.6531 - regression_loss: 0.5884 - classification_loss: 0.0646 388/500 [======================>.......] - ETA: 42s - loss: 0.6530 - regression_loss: 0.5884 - classification_loss: 0.0646 389/500 [======================>.......] - ETA: 42s - loss: 0.6537 - regression_loss: 0.5890 - classification_loss: 0.0647 390/500 [======================>.......] - ETA: 42s - loss: 0.6540 - regression_loss: 0.5893 - classification_loss: 0.0647 391/500 [======================>.......] - ETA: 41s - loss: 0.6531 - regression_loss: 0.5885 - classification_loss: 0.0646 392/500 [======================>.......] - ETA: 41s - loss: 0.6526 - regression_loss: 0.5880 - classification_loss: 0.0645 393/500 [======================>.......] 
- ETA: 40s - loss: 0.6529 - regression_loss: 0.5883 - classification_loss: 0.0646 394/500 [======================>.......] - ETA: 40s - loss: 0.6540 - regression_loss: 0.5892 - classification_loss: 0.0648 395/500 [======================>.......] - ETA: 40s - loss: 0.6539 - regression_loss: 0.5891 - classification_loss: 0.0648 396/500 [======================>.......] - ETA: 39s - loss: 0.6535 - regression_loss: 0.5887 - classification_loss: 0.0647 397/500 [======================>.......] - ETA: 39s - loss: 0.6537 - regression_loss: 0.5889 - classification_loss: 0.0648 398/500 [======================>.......] - ETA: 39s - loss: 0.6525 - regression_loss: 0.5879 - classification_loss: 0.0646 399/500 [======================>.......] - ETA: 38s - loss: 0.6529 - regression_loss: 0.5882 - classification_loss: 0.0647 400/500 [=======================>......] - ETA: 38s - loss: 0.6521 - regression_loss: 0.5875 - classification_loss: 0.0646 401/500 [=======================>......] - ETA: 37s - loss: 0.6519 - regression_loss: 0.5873 - classification_loss: 0.0646 402/500 [=======================>......] - ETA: 37s - loss: 0.6509 - regression_loss: 0.5864 - classification_loss: 0.0645 403/500 [=======================>......] - ETA: 37s - loss: 0.6512 - regression_loss: 0.5867 - classification_loss: 0.0645 404/500 [=======================>......] - ETA: 36s - loss: 0.6521 - regression_loss: 0.5875 - classification_loss: 0.0646 405/500 [=======================>......] - ETA: 36s - loss: 0.6517 - regression_loss: 0.5871 - classification_loss: 0.0646 406/500 [=======================>......] - ETA: 35s - loss: 0.6525 - regression_loss: 0.5879 - classification_loss: 0.0646 407/500 [=======================>......] - ETA: 35s - loss: 0.6533 - regression_loss: 0.5887 - classification_loss: 0.0646 408/500 [=======================>......] - ETA: 35s - loss: 0.6544 - regression_loss: 0.5895 - classification_loss: 0.0649 409/500 [=======================>......] 
- ETA: 34s - loss: 0.6540 - regression_loss: 0.5892 - classification_loss: 0.0648 410/500 [=======================>......] - ETA: 34s - loss: 0.6541 - regression_loss: 0.5892 - classification_loss: 0.0648 411/500 [=======================>......] - ETA: 34s - loss: 0.6537 - regression_loss: 0.5889 - classification_loss: 0.0648 412/500 [=======================>......] - ETA: 33s - loss: 0.6537 - regression_loss: 0.5889 - classification_loss: 0.0648 413/500 [=======================>......] - ETA: 33s - loss: 0.6531 - regression_loss: 0.5883 - classification_loss: 0.0648 414/500 [=======================>......] - ETA: 32s - loss: 0.6530 - regression_loss: 0.5881 - classification_loss: 0.0648 415/500 [=======================>......] - ETA: 32s - loss: 0.6531 - regression_loss: 0.5882 - classification_loss: 0.0649 416/500 [=======================>......] - ETA: 32s - loss: 0.6532 - regression_loss: 0.5883 - classification_loss: 0.0649 417/500 [========================>.....] - ETA: 31s - loss: 0.6531 - regression_loss: 0.5882 - classification_loss: 0.0649 418/500 [========================>.....] - ETA: 31s - loss: 0.6533 - regression_loss: 0.5884 - classification_loss: 0.0649 419/500 [========================>.....] - ETA: 30s - loss: 0.6531 - regression_loss: 0.5882 - classification_loss: 0.0648 420/500 [========================>.....] - ETA: 30s - loss: 0.6528 - regression_loss: 0.5880 - classification_loss: 0.0648 421/500 [========================>.....] - ETA: 30s - loss: 0.6523 - regression_loss: 0.5876 - classification_loss: 0.0647 422/500 [========================>.....] - ETA: 29s - loss: 0.6516 - regression_loss: 0.5869 - classification_loss: 0.0647 423/500 [========================>.....] - ETA: 29s - loss: 0.6510 - regression_loss: 0.5864 - classification_loss: 0.0646 424/500 [========================>.....] - ETA: 29s - loss: 0.6510 - regression_loss: 0.5864 - classification_loss: 0.0646 425/500 [========================>.....] 
- ETA: 28s - loss: 0.6510 - regression_loss: 0.5864 - classification_loss: 0.0647 426/500 [========================>.....] - ETA: 28s - loss: 0.6508 - regression_loss: 0.5861 - classification_loss: 0.0646 427/500 [========================>.....] - ETA: 27s - loss: 0.6505 - regression_loss: 0.5858 - classification_loss: 0.0647 428/500 [========================>.....] - ETA: 27s - loss: 0.6499 - regression_loss: 0.5853 - classification_loss: 0.0646 429/500 [========================>.....] - ETA: 27s - loss: 0.6504 - regression_loss: 0.5857 - classification_loss: 0.0647 430/500 [========================>.....] - ETA: 26s - loss: 0.6500 - regression_loss: 0.5853 - classification_loss: 0.0647 431/500 [========================>.....] - ETA: 26s - loss: 0.6500 - regression_loss: 0.5854 - classification_loss: 0.0646 432/500 [========================>.....] - ETA: 26s - loss: 0.6499 - regression_loss: 0.5853 - classification_loss: 0.0646 433/500 [========================>.....] - ETA: 25s - loss: 0.6501 - regression_loss: 0.5855 - classification_loss: 0.0646 434/500 [=========================>....] - ETA: 25s - loss: 0.6498 - regression_loss: 0.5852 - classification_loss: 0.0645 435/500 [=========================>....] - ETA: 24s - loss: 0.6493 - regression_loss: 0.5848 - classification_loss: 0.0645 436/500 [=========================>....] - ETA: 24s - loss: 0.6488 - regression_loss: 0.5844 - classification_loss: 0.0644 437/500 [=========================>....] - ETA: 24s - loss: 0.6485 - regression_loss: 0.5841 - classification_loss: 0.0644 438/500 [=========================>....] - ETA: 23s - loss: 0.6487 - regression_loss: 0.5841 - classification_loss: 0.0645 439/500 [=========================>....] - ETA: 23s - loss: 0.6489 - regression_loss: 0.5842 - classification_loss: 0.0646 440/500 [=========================>....] - ETA: 22s - loss: 0.6481 - regression_loss: 0.5836 - classification_loss: 0.0645 441/500 [=========================>....] 
- ETA: 22s - loss: 0.6478 - regression_loss: 0.5833 - classification_loss: 0.0644 442/500 [=========================>....] - ETA: 22s - loss: 0.6474 - regression_loss: 0.5831 - classification_loss: 0.0644 443/500 [=========================>....] - ETA: 21s - loss: 0.6472 - regression_loss: 0.5829 - classification_loss: 0.0643 444/500 [=========================>....] - ETA: 21s - loss: 0.6465 - regression_loss: 0.5823 - classification_loss: 0.0642 445/500 [=========================>....] - ETA: 21s - loss: 0.6472 - regression_loss: 0.5828 - classification_loss: 0.0644 446/500 [=========================>....] - ETA: 20s - loss: 0.6473 - regression_loss: 0.5829 - classification_loss: 0.0644 447/500 [=========================>....] - ETA: 20s - loss: 0.6476 - regression_loss: 0.5832 - classification_loss: 0.0644 448/500 [=========================>....] - ETA: 19s - loss: 0.6486 - regression_loss: 0.5840 - classification_loss: 0.0645 449/500 [=========================>....] - ETA: 19s - loss: 0.6491 - regression_loss: 0.5843 - classification_loss: 0.0647 450/500 [==========================>...] - ETA: 19s - loss: 0.6489 - regression_loss: 0.5842 - classification_loss: 0.0647 451/500 [==========================>...] - ETA: 18s - loss: 0.6487 - regression_loss: 0.5840 - classification_loss: 0.0647 452/500 [==========================>...] - ETA: 18s - loss: 0.6496 - regression_loss: 0.5849 - classification_loss: 0.0647 453/500 [==========================>...] - ETA: 17s - loss: 0.6497 - regression_loss: 0.5849 - classification_loss: 0.0647 454/500 [==========================>...] - ETA: 17s - loss: 0.6503 - regression_loss: 0.5854 - classification_loss: 0.0649 455/500 [==========================>...] - ETA: 17s - loss: 0.6499 - regression_loss: 0.5851 - classification_loss: 0.0647 456/500 [==========================>...] - ETA: 16s - loss: 0.6507 - regression_loss: 0.5858 - classification_loss: 0.0649 457/500 [==========================>...] 
- ETA: 16s - loss: 0.6499 - regression_loss: 0.5851 - classification_loss: 0.0648 458/500 [==========================>...] - ETA: 16s - loss: 0.6494 - regression_loss: 0.5846 - classification_loss: 0.0647 459/500 [==========================>...] - ETA: 15s - loss: 0.6485 - regression_loss: 0.5839 - classification_loss: 0.0646 460/500 [==========================>...] - ETA: 15s - loss: 0.6480 - regression_loss: 0.5834 - classification_loss: 0.0646 461/500 [==========================>...] - ETA: 14s - loss: 0.6479 - regression_loss: 0.5833 - classification_loss: 0.0646 462/500 [==========================>...] - ETA: 14s - loss: 0.6472 - regression_loss: 0.5827 - classification_loss: 0.0645 463/500 [==========================>...] - ETA: 14s - loss: 0.6465 - regression_loss: 0.5822 - classification_loss: 0.0644 464/500 [==========================>...] - ETA: 13s - loss: 0.6466 - regression_loss: 0.5822 - classification_loss: 0.0644 465/500 [==========================>...] - ETA: 13s - loss: 0.6465 - regression_loss: 0.5821 - classification_loss: 0.0644 466/500 [==========================>...] - ETA: 12s - loss: 0.6468 - regression_loss: 0.5824 - classification_loss: 0.0644 467/500 [===========================>..] - ETA: 12s - loss: 0.6472 - regression_loss: 0.5828 - classification_loss: 0.0644 468/500 [===========================>..] - ETA: 12s - loss: 0.6483 - regression_loss: 0.5838 - classification_loss: 0.0645 469/500 [===========================>..] - ETA: 11s - loss: 0.6477 - regression_loss: 0.5833 - classification_loss: 0.0644 470/500 [===========================>..] - ETA: 11s - loss: 0.6477 - regression_loss: 0.5833 - classification_loss: 0.0644 471/500 [===========================>..] - ETA: 11s - loss: 0.6479 - regression_loss: 0.5834 - classification_loss: 0.0644 472/500 [===========================>..] - ETA: 10s - loss: 0.6473 - regression_loss: 0.5830 - classification_loss: 0.0644 473/500 [===========================>..] 
- ETA: 10s - loss: 0.6476 - regression_loss: 0.5832 - classification_loss: 0.0644 474/500 [===========================>..] - ETA: 9s - loss: 0.6482 - regression_loss: 0.5838 - classification_loss: 0.0644  475/500 [===========================>..] - ETA: 9s - loss: 0.6482 - regression_loss: 0.5837 - classification_loss: 0.0644 476/500 [===========================>..] - ETA: 9s - loss: 0.6484 - regression_loss: 0.5840 - classification_loss: 0.0644 477/500 [===========================>..] - ETA: 8s - loss: 0.6486 - regression_loss: 0.5842 - classification_loss: 0.0644 478/500 [===========================>..] - ETA: 8s - loss: 0.6486 - regression_loss: 0.5841 - classification_loss: 0.0644 479/500 [===========================>..] - ETA: 8s - loss: 0.6486 - regression_loss: 0.5841 - classification_loss: 0.0644 480/500 [===========================>..] - ETA: 7s - loss: 0.6484 - regression_loss: 0.5840 - classification_loss: 0.0644 481/500 [===========================>..] - ETA: 7s - loss: 0.6490 - regression_loss: 0.5846 - classification_loss: 0.0644 482/500 [===========================>..] - ETA: 6s - loss: 0.6498 - regression_loss: 0.5852 - classification_loss: 0.0646 483/500 [===========================>..] - ETA: 6s - loss: 0.6494 - regression_loss: 0.5849 - classification_loss: 0.0645 484/500 [============================>.] - ETA: 6s - loss: 0.6485 - regression_loss: 0.5841 - classification_loss: 0.0644 485/500 [============================>.] - ETA: 5s - loss: 0.6482 - regression_loss: 0.5839 - classification_loss: 0.0644 486/500 [============================>.] - ETA: 5s - loss: 0.6485 - regression_loss: 0.5841 - classification_loss: 0.0644 487/500 [============================>.] - ETA: 4s - loss: 0.6493 - regression_loss: 0.5848 - classification_loss: 0.0645 488/500 [============================>.] - ETA: 4s - loss: 0.6496 - regression_loss: 0.5851 - classification_loss: 0.0645 489/500 [============================>.] 
500/500 [==============================] - 191s 382ms/step - loss: 0.6472 - regression_loss: 0.5831 - classification_loss: 0.0641
1172 instances of class plum with average precision: 0.7190
mAP: 0.7190
Epoch 00017: saving model to ./training/snapshots/resnet101_pascal_17.h5
Epoch 18/20
[per-step progress output for epoch 18 elided; log truncated mid-epoch at step 52/500, loss ~0.548]
- ETA: 2:53 - loss: 0.5534 - regression_loss: 0.4999 - classification_loss: 0.0535 53/500 [==>...........................] - ETA: 2:52 - loss: 0.5568 - regression_loss: 0.5024 - classification_loss: 0.0544 54/500 [==>...........................] - ETA: 2:52 - loss: 0.5506 - regression_loss: 0.4969 - classification_loss: 0.0537 55/500 [==>...........................] - ETA: 2:52 - loss: 0.5573 - regression_loss: 0.5025 - classification_loss: 0.0548 56/500 [==>...........................] - ETA: 2:51 - loss: 0.5604 - regression_loss: 0.5057 - classification_loss: 0.0546 57/500 [==>...........................] - ETA: 2:51 - loss: 0.5616 - regression_loss: 0.5072 - classification_loss: 0.0545 58/500 [==>...........................] - ETA: 2:51 - loss: 0.5584 - regression_loss: 0.5044 - classification_loss: 0.0541 59/500 [==>...........................] - ETA: 2:50 - loss: 0.5547 - regression_loss: 0.5007 - classification_loss: 0.0540 60/500 [==>...........................] - ETA: 2:50 - loss: 0.5584 - regression_loss: 0.5043 - classification_loss: 0.0541 61/500 [==>...........................] - ETA: 2:49 - loss: 0.5559 - regression_loss: 0.5021 - classification_loss: 0.0538 62/500 [==>...........................] - ETA: 2:49 - loss: 0.5518 - regression_loss: 0.4985 - classification_loss: 0.0533 63/500 [==>...........................] - ETA: 2:49 - loss: 0.5503 - regression_loss: 0.4966 - classification_loss: 0.0537 64/500 [==>...........................] - ETA: 2:49 - loss: 0.5505 - regression_loss: 0.4968 - classification_loss: 0.0537 65/500 [==>...........................] - ETA: 2:48 - loss: 0.5526 - regression_loss: 0.4992 - classification_loss: 0.0534 66/500 [==>...........................] - ETA: 2:48 - loss: 0.5620 - regression_loss: 0.5083 - classification_loss: 0.0538 67/500 [===>..........................] - ETA: 2:48 - loss: 0.5718 - regression_loss: 0.5172 - classification_loss: 0.0546 68/500 [===>..........................] 
- ETA: 2:47 - loss: 0.5709 - regression_loss: 0.5162 - classification_loss: 0.0547 69/500 [===>..........................] - ETA: 2:47 - loss: 0.5779 - regression_loss: 0.5227 - classification_loss: 0.0552 70/500 [===>..........................] - ETA: 2:46 - loss: 0.5773 - regression_loss: 0.5224 - classification_loss: 0.0549 71/500 [===>..........................] - ETA: 2:46 - loss: 0.5738 - regression_loss: 0.5194 - classification_loss: 0.0544 72/500 [===>..........................] - ETA: 2:46 - loss: 0.5764 - regression_loss: 0.5218 - classification_loss: 0.0546 73/500 [===>..........................] - ETA: 2:45 - loss: 0.5732 - regression_loss: 0.5188 - classification_loss: 0.0544 74/500 [===>..........................] - ETA: 2:45 - loss: 0.5709 - regression_loss: 0.5170 - classification_loss: 0.0539 75/500 [===>..........................] - ETA: 2:44 - loss: 0.5730 - regression_loss: 0.5195 - classification_loss: 0.0535 76/500 [===>..........................] - ETA: 2:44 - loss: 0.5700 - regression_loss: 0.5168 - classification_loss: 0.0531 77/500 [===>..........................] - ETA: 2:43 - loss: 0.5734 - regression_loss: 0.5196 - classification_loss: 0.0538 78/500 [===>..........................] - ETA: 2:43 - loss: 0.5796 - regression_loss: 0.5245 - classification_loss: 0.0551 79/500 [===>..........................] - ETA: 2:43 - loss: 0.5835 - regression_loss: 0.5280 - classification_loss: 0.0555 80/500 [===>..........................] - ETA: 2:42 - loss: 0.5851 - regression_loss: 0.5299 - classification_loss: 0.0552 81/500 [===>..........................] - ETA: 2:42 - loss: 0.5828 - regression_loss: 0.5278 - classification_loss: 0.0550 82/500 [===>..........................] - ETA: 2:41 - loss: 0.5876 - regression_loss: 0.5324 - classification_loss: 0.0552 83/500 [===>..........................] - ETA: 2:41 - loss: 0.5889 - regression_loss: 0.5337 - classification_loss: 0.0552 84/500 [====>.........................] 
- ETA: 2:40 - loss: 0.5928 - regression_loss: 0.5374 - classification_loss: 0.0554 85/500 [====>.........................] - ETA: 2:40 - loss: 0.5928 - regression_loss: 0.5374 - classification_loss: 0.0554 86/500 [====>.........................] - ETA: 2:40 - loss: 0.5944 - regression_loss: 0.5389 - classification_loss: 0.0556 87/500 [====>.........................] - ETA: 2:39 - loss: 0.5957 - regression_loss: 0.5399 - classification_loss: 0.0558 88/500 [====>.........................] - ETA: 2:39 - loss: 0.5978 - regression_loss: 0.5419 - classification_loss: 0.0559 89/500 [====>.........................] - ETA: 2:38 - loss: 0.5950 - regression_loss: 0.5395 - classification_loss: 0.0555 90/500 [====>.........................] - ETA: 2:38 - loss: 0.5957 - regression_loss: 0.5400 - classification_loss: 0.0558 91/500 [====>.........................] - ETA: 2:37 - loss: 0.5962 - regression_loss: 0.5404 - classification_loss: 0.0558 92/500 [====>.........................] - ETA: 2:37 - loss: 0.5982 - regression_loss: 0.5423 - classification_loss: 0.0559 93/500 [====>.........................] - ETA: 2:36 - loss: 0.5970 - regression_loss: 0.5415 - classification_loss: 0.0556 94/500 [====>.........................] - ETA: 2:36 - loss: 0.5940 - regression_loss: 0.5388 - classification_loss: 0.0552 95/500 [====>.........................] - ETA: 2:35 - loss: 0.5952 - regression_loss: 0.5400 - classification_loss: 0.0552 96/500 [====>.........................] - ETA: 2:35 - loss: 0.5964 - regression_loss: 0.5410 - classification_loss: 0.0554 97/500 [====>.........................] - ETA: 2:35 - loss: 0.6013 - regression_loss: 0.5453 - classification_loss: 0.0560 98/500 [====>.........................] - ETA: 2:34 - loss: 0.5993 - regression_loss: 0.5436 - classification_loss: 0.0558 99/500 [====>.........................] - ETA: 2:34 - loss: 0.6044 - regression_loss: 0.5477 - classification_loss: 0.0567 100/500 [=====>........................] 
- ETA: 2:34 - loss: 0.6033 - regression_loss: 0.5467 - classification_loss: 0.0566 101/500 [=====>........................] - ETA: 2:33 - loss: 0.6032 - regression_loss: 0.5465 - classification_loss: 0.0567 102/500 [=====>........................] - ETA: 2:33 - loss: 0.6042 - regression_loss: 0.5474 - classification_loss: 0.0567 103/500 [=====>........................] - ETA: 2:32 - loss: 0.6015 - regression_loss: 0.5451 - classification_loss: 0.0564 104/500 [=====>........................] - ETA: 2:32 - loss: 0.6019 - regression_loss: 0.5454 - classification_loss: 0.0566 105/500 [=====>........................] - ETA: 2:32 - loss: 0.6013 - regression_loss: 0.5448 - classification_loss: 0.0565 106/500 [=====>........................] - ETA: 2:31 - loss: 0.5992 - regression_loss: 0.5428 - classification_loss: 0.0564 107/500 [=====>........................] - ETA: 2:31 - loss: 0.6016 - regression_loss: 0.5450 - classification_loss: 0.0565 108/500 [=====>........................] - ETA: 2:31 - loss: 0.5990 - regression_loss: 0.5428 - classification_loss: 0.0563 109/500 [=====>........................] - ETA: 2:30 - loss: 0.5991 - regression_loss: 0.5431 - classification_loss: 0.0560 110/500 [=====>........................] - ETA: 2:30 - loss: 0.5974 - regression_loss: 0.5417 - classification_loss: 0.0557 111/500 [=====>........................] - ETA: 2:29 - loss: 0.6061 - regression_loss: 0.5490 - classification_loss: 0.0572 112/500 [=====>........................] - ETA: 2:29 - loss: 0.6105 - regression_loss: 0.5531 - classification_loss: 0.0574 113/500 [=====>........................] - ETA: 2:28 - loss: 0.6134 - regression_loss: 0.5560 - classification_loss: 0.0575 114/500 [=====>........................] - ETA: 2:28 - loss: 0.6124 - regression_loss: 0.5551 - classification_loss: 0.0573 115/500 [=====>........................] - ETA: 2:28 - loss: 0.6138 - regression_loss: 0.5566 - classification_loss: 0.0572 116/500 [=====>........................] 
- ETA: 2:27 - loss: 0.6168 - regression_loss: 0.5594 - classification_loss: 0.0574 117/500 [======>.......................] - ETA: 2:27 - loss: 0.6166 - regression_loss: 0.5593 - classification_loss: 0.0573 118/500 [======>.......................] - ETA: 2:27 - loss: 0.6205 - regression_loss: 0.5628 - classification_loss: 0.0577 119/500 [======>.......................] - ETA: 2:26 - loss: 0.6256 - regression_loss: 0.5657 - classification_loss: 0.0599 120/500 [======>.......................] - ETA: 2:26 - loss: 0.6272 - regression_loss: 0.5674 - classification_loss: 0.0598 121/500 [======>.......................] - ETA: 2:25 - loss: 0.6281 - regression_loss: 0.5683 - classification_loss: 0.0598 122/500 [======>.......................] - ETA: 2:25 - loss: 0.6278 - regression_loss: 0.5682 - classification_loss: 0.0597 123/500 [======>.......................] - ETA: 2:25 - loss: 0.6267 - regression_loss: 0.5671 - classification_loss: 0.0596 124/500 [======>.......................] - ETA: 2:24 - loss: 0.6263 - regression_loss: 0.5667 - classification_loss: 0.0596 125/500 [======>.......................] - ETA: 2:24 - loss: 0.6303 - regression_loss: 0.5698 - classification_loss: 0.0605 126/500 [======>.......................] - ETA: 2:23 - loss: 0.6307 - regression_loss: 0.5701 - classification_loss: 0.0606 127/500 [======>.......................] - ETA: 2:23 - loss: 0.6332 - regression_loss: 0.5720 - classification_loss: 0.0611 128/500 [======>.......................] - ETA: 2:23 - loss: 0.6346 - regression_loss: 0.5733 - classification_loss: 0.0613 129/500 [======>.......................] - ETA: 2:22 - loss: 0.6321 - regression_loss: 0.5712 - classification_loss: 0.0609 130/500 [======>.......................] - ETA: 2:22 - loss: 0.6334 - regression_loss: 0.5723 - classification_loss: 0.0612 131/500 [======>.......................] - ETA: 2:21 - loss: 0.6307 - regression_loss: 0.5697 - classification_loss: 0.0609 132/500 [======>.......................] 
- ETA: 2:21 - loss: 0.6302 - regression_loss: 0.5693 - classification_loss: 0.0609 133/500 [======>.......................] - ETA: 2:21 - loss: 0.6289 - regression_loss: 0.5680 - classification_loss: 0.0609 134/500 [=======>......................] - ETA: 2:20 - loss: 0.6279 - regression_loss: 0.5671 - classification_loss: 0.0608 135/500 [=======>......................] - ETA: 2:20 - loss: 0.6274 - regression_loss: 0.5667 - classification_loss: 0.0607 136/500 [=======>......................] - ETA: 2:19 - loss: 0.6249 - regression_loss: 0.5645 - classification_loss: 0.0603 137/500 [=======>......................] - ETA: 2:19 - loss: 0.6237 - regression_loss: 0.5635 - classification_loss: 0.0602 138/500 [=======>......................] - ETA: 2:19 - loss: 0.6223 - regression_loss: 0.5623 - classification_loss: 0.0600 139/500 [=======>......................] - ETA: 2:18 - loss: 0.6241 - regression_loss: 0.5638 - classification_loss: 0.0603 140/500 [=======>......................] - ETA: 2:18 - loss: 0.6277 - regression_loss: 0.5664 - classification_loss: 0.0612 141/500 [=======>......................] - ETA: 2:17 - loss: 0.6280 - regression_loss: 0.5668 - classification_loss: 0.0612 142/500 [=======>......................] - ETA: 2:17 - loss: 0.6276 - regression_loss: 0.5664 - classification_loss: 0.0611 143/500 [=======>......................] - ETA: 2:17 - loss: 0.6284 - regression_loss: 0.5672 - classification_loss: 0.0612 144/500 [=======>......................] - ETA: 2:16 - loss: 0.6284 - regression_loss: 0.5670 - classification_loss: 0.0613 145/500 [=======>......................] - ETA: 2:16 - loss: 0.6272 - regression_loss: 0.5662 - classification_loss: 0.0611 146/500 [=======>......................] - ETA: 2:15 - loss: 0.6266 - regression_loss: 0.5658 - classification_loss: 0.0608 147/500 [=======>......................] - ETA: 2:15 - loss: 0.6266 - regression_loss: 0.5658 - classification_loss: 0.0608 148/500 [=======>......................] 
- ETA: 2:15 - loss: 0.6279 - regression_loss: 0.5671 - classification_loss: 0.0608 149/500 [=======>......................] - ETA: 2:14 - loss: 0.6286 - regression_loss: 0.5678 - classification_loss: 0.0608 150/500 [========>.....................] - ETA: 2:14 - loss: 0.6293 - regression_loss: 0.5684 - classification_loss: 0.0609 151/500 [========>.....................] - ETA: 2:14 - loss: 0.6276 - regression_loss: 0.5669 - classification_loss: 0.0607 152/500 [========>.....................] - ETA: 2:13 - loss: 0.6276 - regression_loss: 0.5669 - classification_loss: 0.0607 153/500 [========>.....................] - ETA: 2:13 - loss: 0.6293 - regression_loss: 0.5688 - classification_loss: 0.0605 154/500 [========>.....................] - ETA: 2:13 - loss: 0.6296 - regression_loss: 0.5690 - classification_loss: 0.0606 155/500 [========>.....................] - ETA: 2:12 - loss: 0.6274 - regression_loss: 0.5671 - classification_loss: 0.0603 156/500 [========>.....................] - ETA: 2:12 - loss: 0.6262 - regression_loss: 0.5661 - classification_loss: 0.0601 157/500 [========>.....................] - ETA: 2:11 - loss: 0.6257 - regression_loss: 0.5656 - classification_loss: 0.0601 158/500 [========>.....................] - ETA: 2:11 - loss: 0.6251 - regression_loss: 0.5650 - classification_loss: 0.0601 159/500 [========>.....................] - ETA: 2:11 - loss: 0.6251 - regression_loss: 0.5650 - classification_loss: 0.0601 160/500 [========>.....................] - ETA: 2:10 - loss: 0.6238 - regression_loss: 0.5639 - classification_loss: 0.0599 161/500 [========>.....................] - ETA: 2:10 - loss: 0.6255 - regression_loss: 0.5655 - classification_loss: 0.0599 162/500 [========>.....................] - ETA: 2:10 - loss: 0.6237 - regression_loss: 0.5640 - classification_loss: 0.0597 163/500 [========>.....................] - ETA: 2:09 - loss: 0.6227 - regression_loss: 0.5631 - classification_loss: 0.0596 164/500 [========>.....................] 
- ETA: 2:09 - loss: 0.6230 - regression_loss: 0.5634 - classification_loss: 0.0596 165/500 [========>.....................] - ETA: 2:08 - loss: 0.6224 - regression_loss: 0.5629 - classification_loss: 0.0595 166/500 [========>.....................] - ETA: 2:08 - loss: 0.6215 - regression_loss: 0.5621 - classification_loss: 0.0594 167/500 [=========>....................] - ETA: 2:08 - loss: 0.6196 - regression_loss: 0.5605 - classification_loss: 0.0591 168/500 [=========>....................] - ETA: 2:07 - loss: 0.6197 - regression_loss: 0.5605 - classification_loss: 0.0592 169/500 [=========>....................] - ETA: 2:07 - loss: 0.6201 - regression_loss: 0.5609 - classification_loss: 0.0592 170/500 [=========>....................] - ETA: 2:06 - loss: 0.6221 - regression_loss: 0.5627 - classification_loss: 0.0594 171/500 [=========>....................] - ETA: 2:06 - loss: 0.6218 - regression_loss: 0.5625 - classification_loss: 0.0593 172/500 [=========>....................] - ETA: 2:06 - loss: 0.6205 - regression_loss: 0.5613 - classification_loss: 0.0592 173/500 [=========>....................] - ETA: 2:05 - loss: 0.6191 - regression_loss: 0.5600 - classification_loss: 0.0591 174/500 [=========>....................] - ETA: 2:05 - loss: 0.6191 - regression_loss: 0.5599 - classification_loss: 0.0591 175/500 [=========>....................] - ETA: 2:05 - loss: 0.6192 - regression_loss: 0.5600 - classification_loss: 0.0592 176/500 [=========>....................] - ETA: 2:04 - loss: 0.6180 - regression_loss: 0.5589 - classification_loss: 0.0591 177/500 [=========>....................] - ETA: 2:04 - loss: 0.6182 - regression_loss: 0.5592 - classification_loss: 0.0590 178/500 [=========>....................] - ETA: 2:03 - loss: 0.6190 - regression_loss: 0.5600 - classification_loss: 0.0590 179/500 [=========>....................] - ETA: 2:03 - loss: 0.6166 - regression_loss: 0.5578 - classification_loss: 0.0588 180/500 [=========>....................] 
- ETA: 2:03 - loss: 0.6144 - regression_loss: 0.5559 - classification_loss: 0.0585 181/500 [=========>....................] - ETA: 2:02 - loss: 0.6130 - regression_loss: 0.5546 - classification_loss: 0.0584 182/500 [=========>....................] - ETA: 2:02 - loss: 0.6135 - regression_loss: 0.5551 - classification_loss: 0.0584 183/500 [=========>....................] - ETA: 2:01 - loss: 0.6146 - regression_loss: 0.5560 - classification_loss: 0.0586 184/500 [==========>...................] - ETA: 2:01 - loss: 0.6133 - regression_loss: 0.5548 - classification_loss: 0.0585 185/500 [==========>...................] - ETA: 2:01 - loss: 0.6146 - regression_loss: 0.5559 - classification_loss: 0.0587 186/500 [==========>...................] - ETA: 2:00 - loss: 0.6148 - regression_loss: 0.5561 - classification_loss: 0.0587 187/500 [==========>...................] - ETA: 2:00 - loss: 0.6138 - regression_loss: 0.5552 - classification_loss: 0.0586 188/500 [==========>...................] - ETA: 2:00 - loss: 0.6131 - regression_loss: 0.5545 - classification_loss: 0.0586 189/500 [==========>...................] - ETA: 1:59 - loss: 0.6138 - regression_loss: 0.5553 - classification_loss: 0.0585 190/500 [==========>...................] - ETA: 1:59 - loss: 0.6163 - regression_loss: 0.5575 - classification_loss: 0.0588 191/500 [==========>...................] - ETA: 1:58 - loss: 0.6173 - regression_loss: 0.5585 - classification_loss: 0.0588 192/500 [==========>...................] - ETA: 1:58 - loss: 0.6175 - regression_loss: 0.5587 - classification_loss: 0.0588 193/500 [==========>...................] - ETA: 1:57 - loss: 0.6169 - regression_loss: 0.5583 - classification_loss: 0.0586 194/500 [==========>...................] - ETA: 1:57 - loss: 0.6176 - regression_loss: 0.5590 - classification_loss: 0.0587 195/500 [==========>...................] - ETA: 1:57 - loss: 0.6171 - regression_loss: 0.5585 - classification_loss: 0.0586 196/500 [==========>...................] 
- ETA: 1:56 - loss: 0.6166 - regression_loss: 0.5581 - classification_loss: 0.0585 197/500 [==========>...................] - ETA: 1:56 - loss: 0.6155 - regression_loss: 0.5571 - classification_loss: 0.0584 198/500 [==========>...................] - ETA: 1:55 - loss: 0.6145 - regression_loss: 0.5563 - classification_loss: 0.0583 199/500 [==========>...................] - ETA: 1:55 - loss: 0.6149 - regression_loss: 0.5565 - classification_loss: 0.0584 200/500 [===========>..................] - ETA: 1:55 - loss: 0.6140 - regression_loss: 0.5558 - classification_loss: 0.0583 201/500 [===========>..................] - ETA: 1:54 - loss: 0.6142 - regression_loss: 0.5560 - classification_loss: 0.0582 202/500 [===========>..................] - ETA: 1:54 - loss: 0.6146 - regression_loss: 0.5564 - classification_loss: 0.0582 203/500 [===========>..................] - ETA: 1:54 - loss: 0.6135 - regression_loss: 0.5554 - classification_loss: 0.0581 204/500 [===========>..................] - ETA: 1:53 - loss: 0.6145 - regression_loss: 0.5562 - classification_loss: 0.0582 205/500 [===========>..................] - ETA: 1:53 - loss: 0.6146 - regression_loss: 0.5564 - classification_loss: 0.0583 206/500 [===========>..................] - ETA: 1:52 - loss: 0.6156 - regression_loss: 0.5572 - classification_loss: 0.0584 207/500 [===========>..................] - ETA: 1:52 - loss: 0.6159 - regression_loss: 0.5575 - classification_loss: 0.0584 208/500 [===========>..................] - ETA: 1:52 - loss: 0.6154 - regression_loss: 0.5564 - classification_loss: 0.0591 209/500 [===========>..................] - ETA: 1:51 - loss: 0.6147 - regression_loss: 0.5558 - classification_loss: 0.0589 210/500 [===========>..................] - ETA: 1:51 - loss: 0.6136 - regression_loss: 0.5548 - classification_loss: 0.0588 211/500 [===========>..................] - ETA: 1:51 - loss: 0.6117 - regression_loss: 0.5531 - classification_loss: 0.0586 212/500 [===========>..................] 
- ETA: 1:50 - loss: 0.6123 - regression_loss: 0.5536 - classification_loss: 0.0587 213/500 [===========>..................] - ETA: 1:50 - loss: 0.6124 - regression_loss: 0.5537 - classification_loss: 0.0587 214/500 [===========>..................] - ETA: 1:49 - loss: 0.6115 - regression_loss: 0.5529 - classification_loss: 0.0586 215/500 [===========>..................] - ETA: 1:49 - loss: 0.6108 - regression_loss: 0.5523 - classification_loss: 0.0585 216/500 [===========>..................] - ETA: 1:49 - loss: 0.6108 - regression_loss: 0.5523 - classification_loss: 0.0585 217/500 [============>.................] - ETA: 1:48 - loss: 0.6096 - regression_loss: 0.5512 - classification_loss: 0.0584 218/500 [============>.................] - ETA: 1:48 - loss: 0.6084 - regression_loss: 0.5501 - classification_loss: 0.0583 219/500 [============>.................] - ETA: 1:47 - loss: 0.6077 - regression_loss: 0.5495 - classification_loss: 0.0582 220/500 [============>.................] - ETA: 1:47 - loss: 0.6072 - regression_loss: 0.5490 - classification_loss: 0.0582 221/500 [============>.................] - ETA: 1:47 - loss: 0.6074 - regression_loss: 0.5493 - classification_loss: 0.0581 222/500 [============>.................] - ETA: 1:46 - loss: 0.6093 - regression_loss: 0.5508 - classification_loss: 0.0585 223/500 [============>.................] - ETA: 1:46 - loss: 0.6086 - regression_loss: 0.5502 - classification_loss: 0.0584 224/500 [============>.................] - ETA: 1:46 - loss: 0.6078 - regression_loss: 0.5495 - classification_loss: 0.0583 225/500 [============>.................] - ETA: 1:45 - loss: 0.6074 - regression_loss: 0.5491 - classification_loss: 0.0583 226/500 [============>.................] - ETA: 1:45 - loss: 0.6081 - regression_loss: 0.5497 - classification_loss: 0.0584 227/500 [============>.................] - ETA: 1:44 - loss: 0.6073 - regression_loss: 0.5489 - classification_loss: 0.0585 228/500 [============>.................] 
- ETA: 1:44 - loss: 0.6088 - regression_loss: 0.5504 - classification_loss: 0.0583 229/500 [============>.................] - ETA: 1:44 - loss: 0.6086 - regression_loss: 0.5503 - classification_loss: 0.0583 230/500 [============>.................] - ETA: 1:43 - loss: 0.6074 - regression_loss: 0.5492 - classification_loss: 0.0582 231/500 [============>.................] - ETA: 1:43 - loss: 0.6070 - regression_loss: 0.5488 - classification_loss: 0.0582 232/500 [============>.................] - ETA: 1:43 - loss: 0.6070 - regression_loss: 0.5488 - classification_loss: 0.0582 233/500 [============>.................] - ETA: 1:42 - loss: 0.6073 - regression_loss: 0.5490 - classification_loss: 0.0584 234/500 [=============>................] - ETA: 1:42 - loss: 0.6068 - regression_loss: 0.5484 - classification_loss: 0.0584 235/500 [=============>................] - ETA: 1:41 - loss: 0.6059 - regression_loss: 0.5476 - classification_loss: 0.0583 236/500 [=============>................] - ETA: 1:41 - loss: 0.6067 - regression_loss: 0.5484 - classification_loss: 0.0583 237/500 [=============>................] - ETA: 1:41 - loss: 0.6082 - regression_loss: 0.5498 - classification_loss: 0.0584 238/500 [=============>................] - ETA: 1:40 - loss: 0.6082 - regression_loss: 0.5497 - classification_loss: 0.0585 239/500 [=============>................] - ETA: 1:40 - loss: 0.6085 - regression_loss: 0.5499 - classification_loss: 0.0586 240/500 [=============>................] - ETA: 1:39 - loss: 0.6089 - regression_loss: 0.5503 - classification_loss: 0.0586 241/500 [=============>................] - ETA: 1:39 - loss: 0.6079 - regression_loss: 0.5495 - classification_loss: 0.0584 242/500 [=============>................] - ETA: 1:39 - loss: 0.6075 - regression_loss: 0.5492 - classification_loss: 0.0583 243/500 [=============>................] - ETA: 1:38 - loss: 0.6073 - regression_loss: 0.5490 - classification_loss: 0.0582 244/500 [=============>................] 
- ETA: 1:38 - loss: 0.6073 - regression_loss: 0.5490 - classification_loss: 0.0582 245/500 [=============>................] - ETA: 1:37 - loss: 0.6076 - regression_loss: 0.5492 - classification_loss: 0.0583 246/500 [=============>................] - ETA: 1:37 - loss: 0.6077 - regression_loss: 0.5493 - classification_loss: 0.0584 247/500 [=============>................] - ETA: 1:37 - loss: 0.6069 - regression_loss: 0.5486 - classification_loss: 0.0583 248/500 [=============>................] - ETA: 1:36 - loss: 0.6065 - regression_loss: 0.5482 - classification_loss: 0.0583 249/500 [=============>................] - ETA: 1:36 - loss: 0.6057 - regression_loss: 0.5474 - classification_loss: 0.0583 250/500 [==============>...............] - ETA: 1:35 - loss: 0.6054 - regression_loss: 0.5472 - classification_loss: 0.0582 251/500 [==============>...............] - ETA: 1:35 - loss: 0.6057 - regression_loss: 0.5474 - classification_loss: 0.0583 252/500 [==============>...............] - ETA: 1:35 - loss: 0.6054 - regression_loss: 0.5472 - classification_loss: 0.0583 253/500 [==============>...............] - ETA: 1:34 - loss: 0.6046 - regression_loss: 0.5465 - classification_loss: 0.0581 254/500 [==============>...............] - ETA: 1:34 - loss: 0.6033 - regression_loss: 0.5453 - classification_loss: 0.0580 255/500 [==============>...............] - ETA: 1:33 - loss: 0.6033 - regression_loss: 0.5454 - classification_loss: 0.0580 256/500 [==============>...............] - ETA: 1:33 - loss: 0.6020 - regression_loss: 0.5441 - classification_loss: 0.0579 257/500 [==============>...............] - ETA: 1:33 - loss: 0.6008 - regression_loss: 0.5430 - classification_loss: 0.0577 258/500 [==============>...............] - ETA: 1:32 - loss: 0.6004 - regression_loss: 0.5427 - classification_loss: 0.0577 259/500 [==============>...............] - ETA: 1:32 - loss: 0.6011 - regression_loss: 0.5435 - classification_loss: 0.0576 260/500 [==============>...............] 
[... per-batch progress updates for steps 261-499 of epoch 18 elided; loss held near 0.60 (regression_loss ~0.54, classification_loss ~0.057) throughout ...] 499/500 [============================>.]
- ETA: 0s - loss: 0.5976 - regression_loss: 0.5409 - classification_loss: 0.0567 500/500 [==============================] - 191s 382ms/step - loss: 0.5981 - regression_loss: 0.5414 - classification_loss: 0.0567 1172 instances of class plum with average precision: 0.7113 mAP: 0.7113 Epoch 00018: saving model to ./training/snapshots/resnet101_pascal_18.h5 Epoch 19/20 1/500 [..............................] - ETA: 3:10 - loss: 0.6044 - regression_loss: 0.5615 - classification_loss: 0.0429 2/500 [..............................] - ETA: 3:15 - loss: 0.4885 - regression_loss: 0.4579 - classification_loss: 0.0306 3/500 [..............................] - ETA: 3:11 - loss: 0.4033 - regression_loss: 0.3772 - classification_loss: 0.0261 4/500 [..............................] - ETA: 3:09 - loss: 0.5695 - regression_loss: 0.5198 - classification_loss: 0.0497 5/500 [..............................] - ETA: 3:09 - loss: 0.5752 - regression_loss: 0.5267 - classification_loss: 0.0485 6/500 [..............................] - ETA: 3:09 - loss: 0.5964 - regression_loss: 0.5482 - classification_loss: 0.0482 7/500 [..............................] - ETA: 3:08 - loss: 0.6382 - regression_loss: 0.5863 - classification_loss: 0.0519 8/500 [..............................] - ETA: 3:06 - loss: 0.6461 - regression_loss: 0.5939 - classification_loss: 0.0521 9/500 [..............................] - ETA: 3:06 - loss: 0.6362 - regression_loss: 0.5849 - classification_loss: 0.0513 10/500 [..............................] - ETA: 3:07 - loss: 0.6004 - regression_loss: 0.5510 - classification_loss: 0.0494 11/500 [..............................] - ETA: 3:08 - loss: 0.6074 - regression_loss: 0.5560 - classification_loss: 0.0514 12/500 [..............................] - ETA: 3:08 - loss: 0.6248 - regression_loss: 0.5714 - classification_loss: 0.0534 13/500 [..............................] - ETA: 3:07 - loss: 0.6363 - regression_loss: 0.5788 - classification_loss: 0.0575 14/500 [..............................] 
[... per-batch progress updates for steps 14-93 of epoch 19 elided; loss fluctuated between ~0.57 and ~0.63 (classification_loss ~0.052-0.058) ...] 94/500 [====>.........................]
- ETA: 2:34 - loss: 0.6274 - regression_loss: 0.5734 - classification_loss: 0.0540 95/500 [====>.........................] - ETA: 2:34 - loss: 0.6282 - regression_loss: 0.5742 - classification_loss: 0.0541 96/500 [====>.........................] - ETA: 2:34 - loss: 0.6246 - regression_loss: 0.5709 - classification_loss: 0.0537 97/500 [====>.........................] - ETA: 2:33 - loss: 0.6268 - regression_loss: 0.5726 - classification_loss: 0.0543 98/500 [====>.........................] - ETA: 2:33 - loss: 0.6269 - regression_loss: 0.5724 - classification_loss: 0.0544 99/500 [====>.........................] - ETA: 2:33 - loss: 0.6276 - regression_loss: 0.5730 - classification_loss: 0.0545 100/500 [=====>........................] - ETA: 2:32 - loss: 0.6243 - regression_loss: 0.5701 - classification_loss: 0.0542 101/500 [=====>........................] - ETA: 2:32 - loss: 0.6218 - regression_loss: 0.5679 - classification_loss: 0.0539 102/500 [=====>........................] - ETA: 2:31 - loss: 0.6249 - regression_loss: 0.5708 - classification_loss: 0.0542 103/500 [=====>........................] - ETA: 2:31 - loss: 0.6255 - regression_loss: 0.5711 - classification_loss: 0.0544 104/500 [=====>........................] - ETA: 2:31 - loss: 0.6274 - regression_loss: 0.5726 - classification_loss: 0.0548 105/500 [=====>........................] - ETA: 2:30 - loss: 0.6260 - regression_loss: 0.5715 - classification_loss: 0.0545 106/500 [=====>........................] - ETA: 2:30 - loss: 0.6255 - regression_loss: 0.5710 - classification_loss: 0.0545 107/500 [=====>........................] - ETA: 2:29 - loss: 0.6267 - regression_loss: 0.5722 - classification_loss: 0.0545 108/500 [=====>........................] - ETA: 2:29 - loss: 0.6240 - regression_loss: 0.5698 - classification_loss: 0.0541 109/500 [=====>........................] - ETA: 2:29 - loss: 0.6244 - regression_loss: 0.5702 - classification_loss: 0.0542 110/500 [=====>........................] 
- ETA: 2:28 - loss: 0.6238 - regression_loss: 0.5695 - classification_loss: 0.0542 111/500 [=====>........................] - ETA: 2:28 - loss: 0.6221 - regression_loss: 0.5680 - classification_loss: 0.0541 112/500 [=====>........................] - ETA: 2:28 - loss: 0.6235 - regression_loss: 0.5690 - classification_loss: 0.0545 113/500 [=====>........................] - ETA: 2:27 - loss: 0.6239 - regression_loss: 0.5695 - classification_loss: 0.0545 114/500 [=====>........................] - ETA: 2:27 - loss: 0.6209 - regression_loss: 0.5668 - classification_loss: 0.0542 115/500 [=====>........................] - ETA: 2:27 - loss: 0.6210 - regression_loss: 0.5667 - classification_loss: 0.0543 116/500 [=====>........................] - ETA: 2:26 - loss: 0.6243 - regression_loss: 0.5692 - classification_loss: 0.0551 117/500 [======>.......................] - ETA: 2:26 - loss: 0.6231 - regression_loss: 0.5681 - classification_loss: 0.0551 118/500 [======>.......................] - ETA: 2:26 - loss: 0.6242 - regression_loss: 0.5688 - classification_loss: 0.0555 119/500 [======>.......................] - ETA: 2:25 - loss: 0.6217 - regression_loss: 0.5666 - classification_loss: 0.0551 120/500 [======>.......................] - ETA: 2:25 - loss: 0.6206 - regression_loss: 0.5656 - classification_loss: 0.0550 121/500 [======>.......................] - ETA: 2:24 - loss: 0.6181 - regression_loss: 0.5633 - classification_loss: 0.0548 122/500 [======>.......................] - ETA: 2:24 - loss: 0.6190 - regression_loss: 0.5639 - classification_loss: 0.0551 123/500 [======>.......................] - ETA: 2:24 - loss: 0.6166 - regression_loss: 0.5617 - classification_loss: 0.0549 124/500 [======>.......................] - ETA: 2:23 - loss: 0.6168 - regression_loss: 0.5615 - classification_loss: 0.0553 125/500 [======>.......................] - ETA: 2:23 - loss: 0.6175 - regression_loss: 0.5622 - classification_loss: 0.0553 126/500 [======>.......................] 
- ETA: 2:23 - loss: 0.6162 - regression_loss: 0.5608 - classification_loss: 0.0554 127/500 [======>.......................] - ETA: 2:22 - loss: 0.6153 - regression_loss: 0.5600 - classification_loss: 0.0553 128/500 [======>.......................] - ETA: 2:22 - loss: 0.6146 - regression_loss: 0.5594 - classification_loss: 0.0552 129/500 [======>.......................] - ETA: 2:21 - loss: 0.6134 - regression_loss: 0.5583 - classification_loss: 0.0551 130/500 [======>.......................] - ETA: 2:21 - loss: 0.6116 - regression_loss: 0.5568 - classification_loss: 0.0547 131/500 [======>.......................] - ETA: 2:21 - loss: 0.6109 - regression_loss: 0.5562 - classification_loss: 0.0547 132/500 [======>.......................] - ETA: 2:20 - loss: 0.6094 - regression_loss: 0.5549 - classification_loss: 0.0545 133/500 [======>.......................] - ETA: 2:20 - loss: 0.6088 - regression_loss: 0.5544 - classification_loss: 0.0545 134/500 [=======>......................] - ETA: 2:20 - loss: 0.6069 - regression_loss: 0.5527 - classification_loss: 0.0542 135/500 [=======>......................] - ETA: 2:19 - loss: 0.6075 - regression_loss: 0.5533 - classification_loss: 0.0543 136/500 [=======>......................] - ETA: 2:19 - loss: 0.6092 - regression_loss: 0.5546 - classification_loss: 0.0546 137/500 [=======>......................] - ETA: 2:18 - loss: 0.6095 - regression_loss: 0.5549 - classification_loss: 0.0546 138/500 [=======>......................] - ETA: 2:18 - loss: 0.6074 - regression_loss: 0.5529 - classification_loss: 0.0545 139/500 [=======>......................] - ETA: 2:18 - loss: 0.6065 - regression_loss: 0.5520 - classification_loss: 0.0545 140/500 [=======>......................] - ETA: 2:17 - loss: 0.6082 - regression_loss: 0.5531 - classification_loss: 0.0551 141/500 [=======>......................] - ETA: 2:17 - loss: 0.6098 - regression_loss: 0.5544 - classification_loss: 0.0554 142/500 [=======>......................] 
- ETA: 2:16 - loss: 0.6084 - regression_loss: 0.5532 - classification_loss: 0.0552 143/500 [=======>......................] - ETA: 2:16 - loss: 0.6070 - regression_loss: 0.5519 - classification_loss: 0.0551 144/500 [=======>......................] - ETA: 2:16 - loss: 0.6047 - regression_loss: 0.5498 - classification_loss: 0.0548 145/500 [=======>......................] - ETA: 2:15 - loss: 0.6049 - regression_loss: 0.5500 - classification_loss: 0.0549 146/500 [=======>......................] - ETA: 2:15 - loss: 0.6055 - regression_loss: 0.5502 - classification_loss: 0.0552 147/500 [=======>......................] - ETA: 2:14 - loss: 0.6049 - regression_loss: 0.5499 - classification_loss: 0.0550 148/500 [=======>......................] - ETA: 2:14 - loss: 0.6016 - regression_loss: 0.5468 - classification_loss: 0.0547 149/500 [=======>......................] - ETA: 2:14 - loss: 0.6041 - regression_loss: 0.5491 - classification_loss: 0.0550 150/500 [========>.....................] - ETA: 2:13 - loss: 0.6035 - regression_loss: 0.5487 - classification_loss: 0.0548 151/500 [========>.....................] - ETA: 2:13 - loss: 0.6040 - regression_loss: 0.5491 - classification_loss: 0.0550 152/500 [========>.....................] - ETA: 2:12 - loss: 0.6039 - regression_loss: 0.5489 - classification_loss: 0.0551 153/500 [========>.....................] - ETA: 2:12 - loss: 0.6038 - regression_loss: 0.5487 - classification_loss: 0.0551 154/500 [========>.....................] - ETA: 2:12 - loss: 0.6035 - regression_loss: 0.5485 - classification_loss: 0.0550 155/500 [========>.....................] - ETA: 2:11 - loss: 0.6033 - regression_loss: 0.5484 - classification_loss: 0.0549 156/500 [========>.....................] - ETA: 2:11 - loss: 0.6041 - regression_loss: 0.5491 - classification_loss: 0.0550 157/500 [========>.....................] - ETA: 2:10 - loss: 0.6027 - regression_loss: 0.5478 - classification_loss: 0.0549 158/500 [========>.....................] 
- ETA: 2:10 - loss: 0.6018 - regression_loss: 0.5468 - classification_loss: 0.0549 159/500 [========>.....................] - ETA: 2:10 - loss: 0.6008 - regression_loss: 0.5460 - classification_loss: 0.0548 160/500 [========>.....................] - ETA: 2:09 - loss: 0.6000 - regression_loss: 0.5453 - classification_loss: 0.0547 161/500 [========>.....................] - ETA: 2:09 - loss: 0.6014 - regression_loss: 0.5462 - classification_loss: 0.0551 162/500 [========>.....................] - ETA: 2:09 - loss: 0.5997 - regression_loss: 0.5447 - classification_loss: 0.0550 163/500 [========>.....................] - ETA: 2:08 - loss: 0.6004 - regression_loss: 0.5454 - classification_loss: 0.0550 164/500 [========>.....................] - ETA: 2:08 - loss: 0.5988 - regression_loss: 0.5441 - classification_loss: 0.0547 165/500 [========>.....................] - ETA: 2:07 - loss: 0.5989 - regression_loss: 0.5442 - classification_loss: 0.0547 166/500 [========>.....................] - ETA: 2:07 - loss: 0.5983 - regression_loss: 0.5437 - classification_loss: 0.0547 167/500 [=========>....................] - ETA: 2:07 - loss: 0.5979 - regression_loss: 0.5433 - classification_loss: 0.0546 168/500 [=========>....................] - ETA: 2:06 - loss: 0.5965 - regression_loss: 0.5421 - classification_loss: 0.0545 169/500 [=========>....................] - ETA: 2:06 - loss: 0.5990 - regression_loss: 0.5439 - classification_loss: 0.0550 170/500 [=========>....................] - ETA: 2:05 - loss: 0.6000 - regression_loss: 0.5449 - classification_loss: 0.0551 171/500 [=========>....................] - ETA: 2:05 - loss: 0.5992 - regression_loss: 0.5441 - classification_loss: 0.0551 172/500 [=========>....................] - ETA: 2:05 - loss: 0.6001 - regression_loss: 0.5449 - classification_loss: 0.0552 173/500 [=========>....................] - ETA: 2:04 - loss: 0.5985 - regression_loss: 0.5435 - classification_loss: 0.0551 174/500 [=========>....................] 
- ETA: 2:04 - loss: 0.5968 - regression_loss: 0.5420 - classification_loss: 0.0548 175/500 [=========>....................] - ETA: 2:04 - loss: 0.5970 - regression_loss: 0.5420 - classification_loss: 0.0551 176/500 [=========>....................] - ETA: 2:03 - loss: 0.5967 - regression_loss: 0.5417 - classification_loss: 0.0549 177/500 [=========>....................] - ETA: 2:03 - loss: 0.5981 - regression_loss: 0.5429 - classification_loss: 0.0552 178/500 [=========>....................] - ETA: 2:02 - loss: 0.5971 - regression_loss: 0.5421 - classification_loss: 0.0550 179/500 [=========>....................] - ETA: 2:02 - loss: 0.5983 - regression_loss: 0.5431 - classification_loss: 0.0553 180/500 [=========>....................] - ETA: 2:02 - loss: 0.5978 - regression_loss: 0.5426 - classification_loss: 0.0552 181/500 [=========>....................] - ETA: 2:01 - loss: 0.5962 - regression_loss: 0.5412 - classification_loss: 0.0550 182/500 [=========>....................] - ETA: 2:01 - loss: 0.5954 - regression_loss: 0.5406 - classification_loss: 0.0549 183/500 [=========>....................] - ETA: 2:00 - loss: 0.5966 - regression_loss: 0.5416 - classification_loss: 0.0550 184/500 [==========>...................] - ETA: 2:00 - loss: 0.5965 - regression_loss: 0.5417 - classification_loss: 0.0549 185/500 [==========>...................] - ETA: 2:00 - loss: 0.5947 - regression_loss: 0.5400 - classification_loss: 0.0547 186/500 [==========>...................] - ETA: 1:59 - loss: 0.5935 - regression_loss: 0.5390 - classification_loss: 0.0545 187/500 [==========>...................] - ETA: 1:59 - loss: 0.5933 - regression_loss: 0.5388 - classification_loss: 0.0545 188/500 [==========>...................] - ETA: 1:58 - loss: 0.5918 - regression_loss: 0.5375 - classification_loss: 0.0543 189/500 [==========>...................] - ETA: 1:58 - loss: 0.5921 - regression_loss: 0.5377 - classification_loss: 0.0544 190/500 [==========>...................] 
- ETA: 1:58 - loss: 0.5918 - regression_loss: 0.5373 - classification_loss: 0.0544 191/500 [==========>...................] - ETA: 1:57 - loss: 0.5913 - regression_loss: 0.5368 - classification_loss: 0.0544 192/500 [==========>...................] - ETA: 1:57 - loss: 0.5900 - regression_loss: 0.5357 - classification_loss: 0.0543 193/500 [==========>...................] - ETA: 1:57 - loss: 0.5890 - regression_loss: 0.5347 - classification_loss: 0.0543 194/500 [==========>...................] - ETA: 1:56 - loss: 0.5882 - regression_loss: 0.5339 - classification_loss: 0.0542 195/500 [==========>...................] - ETA: 1:56 - loss: 0.5879 - regression_loss: 0.5337 - classification_loss: 0.0542 196/500 [==========>...................] - ETA: 1:55 - loss: 0.5863 - regression_loss: 0.5323 - classification_loss: 0.0540 197/500 [==========>...................] - ETA: 1:55 - loss: 0.5876 - regression_loss: 0.5335 - classification_loss: 0.0541 198/500 [==========>...................] - ETA: 1:55 - loss: 0.5897 - regression_loss: 0.5353 - classification_loss: 0.0544 199/500 [==========>...................] - ETA: 1:54 - loss: 0.5890 - regression_loss: 0.5346 - classification_loss: 0.0544 200/500 [===========>..................] - ETA: 1:54 - loss: 0.5903 - regression_loss: 0.5361 - classification_loss: 0.0542 201/500 [===========>..................] - ETA: 1:54 - loss: 0.5891 - regression_loss: 0.5350 - classification_loss: 0.0541 202/500 [===========>..................] - ETA: 1:53 - loss: 0.5899 - regression_loss: 0.5357 - classification_loss: 0.0542 203/500 [===========>..................] - ETA: 1:53 - loss: 0.5883 - regression_loss: 0.5342 - classification_loss: 0.0541 204/500 [===========>..................] - ETA: 1:52 - loss: 0.5888 - regression_loss: 0.5346 - classification_loss: 0.0542 205/500 [===========>..................] - ETA: 1:52 - loss: 0.5887 - regression_loss: 0.5344 - classification_loss: 0.0543 206/500 [===========>..................] 
- ETA: 1:52 - loss: 0.5895 - regression_loss: 0.5351 - classification_loss: 0.0543 207/500 [===========>..................] - ETA: 1:51 - loss: 0.5890 - regression_loss: 0.5348 - classification_loss: 0.0542 208/500 [===========>..................] - ETA: 1:51 - loss: 0.5889 - regression_loss: 0.5347 - classification_loss: 0.0542 209/500 [===========>..................] - ETA: 1:51 - loss: 0.5879 - regression_loss: 0.5338 - classification_loss: 0.0540 210/500 [===========>..................] - ETA: 1:50 - loss: 0.5875 - regression_loss: 0.5335 - classification_loss: 0.0539 211/500 [===========>..................] - ETA: 1:50 - loss: 0.5878 - regression_loss: 0.5338 - classification_loss: 0.0540 212/500 [===========>..................] - ETA: 1:49 - loss: 0.5866 - regression_loss: 0.5328 - classification_loss: 0.0538 213/500 [===========>..................] - ETA: 1:49 - loss: 0.5877 - regression_loss: 0.5337 - classification_loss: 0.0540 214/500 [===========>..................] - ETA: 1:49 - loss: 0.5887 - regression_loss: 0.5347 - classification_loss: 0.0540 215/500 [===========>..................] - ETA: 1:48 - loss: 0.5884 - regression_loss: 0.5345 - classification_loss: 0.0540 216/500 [===========>..................] - ETA: 1:48 - loss: 0.5893 - regression_loss: 0.5353 - classification_loss: 0.0540 217/500 [============>.................] - ETA: 1:47 - loss: 0.5900 - regression_loss: 0.5359 - classification_loss: 0.0541 218/500 [============>.................] - ETA: 1:47 - loss: 0.5887 - regression_loss: 0.5347 - classification_loss: 0.0540 219/500 [============>.................] - ETA: 1:47 - loss: 0.5880 - regression_loss: 0.5341 - classification_loss: 0.0539 220/500 [============>.................] - ETA: 1:46 - loss: 0.5881 - regression_loss: 0.5342 - classification_loss: 0.0538 221/500 [============>.................] - ETA: 1:46 - loss: 0.5886 - regression_loss: 0.5348 - classification_loss: 0.0538 222/500 [============>.................] 
- ETA: 1:45 - loss: 0.5895 - regression_loss: 0.5355 - classification_loss: 0.0540 223/500 [============>.................] - ETA: 1:45 - loss: 0.5900 - regression_loss: 0.5359 - classification_loss: 0.0540 224/500 [============>.................] - ETA: 1:45 - loss: 0.5913 - regression_loss: 0.5370 - classification_loss: 0.0543 225/500 [============>.................] - ETA: 1:44 - loss: 0.5915 - regression_loss: 0.5372 - classification_loss: 0.0543 226/500 [============>.................] - ETA: 1:44 - loss: 0.5917 - regression_loss: 0.5372 - classification_loss: 0.0544 227/500 [============>.................] - ETA: 1:44 - loss: 0.5911 - regression_loss: 0.5367 - classification_loss: 0.0544 228/500 [============>.................] - ETA: 1:43 - loss: 0.5907 - regression_loss: 0.5365 - classification_loss: 0.0542 229/500 [============>.................] - ETA: 1:43 - loss: 0.5902 - regression_loss: 0.5361 - classification_loss: 0.0542 230/500 [============>.................] - ETA: 1:42 - loss: 0.5892 - regression_loss: 0.5352 - classification_loss: 0.0540 231/500 [============>.................] - ETA: 1:42 - loss: 0.5889 - regression_loss: 0.5349 - classification_loss: 0.0540 232/500 [============>.................] - ETA: 1:42 - loss: 0.5896 - regression_loss: 0.5355 - classification_loss: 0.0542 233/500 [============>.................] - ETA: 1:41 - loss: 0.5902 - regression_loss: 0.5360 - classification_loss: 0.0542 234/500 [=============>................] - ETA: 1:41 - loss: 0.5901 - regression_loss: 0.5359 - classification_loss: 0.0542 235/500 [=============>................] - ETA: 1:40 - loss: 0.5889 - regression_loss: 0.5348 - classification_loss: 0.0540 236/500 [=============>................] - ETA: 1:40 - loss: 0.5886 - regression_loss: 0.5346 - classification_loss: 0.0540 237/500 [=============>................] - ETA: 1:40 - loss: 0.5883 - regression_loss: 0.5343 - classification_loss: 0.0540 238/500 [=============>................] 
- ETA: 1:39 - loss: 0.5871 - regression_loss: 0.5333 - classification_loss: 0.0538 239/500 [=============>................] - ETA: 1:39 - loss: 0.5859 - regression_loss: 0.5322 - classification_loss: 0.0537 240/500 [=============>................] - ETA: 1:38 - loss: 0.5849 - regression_loss: 0.5314 - classification_loss: 0.0536 241/500 [=============>................] - ETA: 1:38 - loss: 0.5866 - regression_loss: 0.5327 - classification_loss: 0.0538 242/500 [=============>................] - ETA: 1:38 - loss: 0.5857 - regression_loss: 0.5320 - classification_loss: 0.0537 243/500 [=============>................] - ETA: 1:37 - loss: 0.5848 - regression_loss: 0.5311 - classification_loss: 0.0537 244/500 [=============>................] - ETA: 1:37 - loss: 0.5853 - regression_loss: 0.5317 - classification_loss: 0.0537 245/500 [=============>................] - ETA: 1:37 - loss: 0.5854 - regression_loss: 0.5318 - classification_loss: 0.0536 246/500 [=============>................] - ETA: 1:36 - loss: 0.5848 - regression_loss: 0.5313 - classification_loss: 0.0535 247/500 [=============>................] - ETA: 1:36 - loss: 0.5856 - regression_loss: 0.5321 - classification_loss: 0.0535 248/500 [=============>................] - ETA: 1:35 - loss: 0.5859 - regression_loss: 0.5324 - classification_loss: 0.0535 249/500 [=============>................] - ETA: 1:35 - loss: 0.5851 - regression_loss: 0.5317 - classification_loss: 0.0534 250/500 [==============>...............] - ETA: 1:35 - loss: 0.5855 - regression_loss: 0.5322 - classification_loss: 0.0533 251/500 [==============>...............] - ETA: 1:34 - loss: 0.5849 - regression_loss: 0.5317 - classification_loss: 0.0531 252/500 [==============>...............] - ETA: 1:34 - loss: 0.5849 - regression_loss: 0.5317 - classification_loss: 0.0532 253/500 [==============>...............] - ETA: 1:34 - loss: 0.5832 - regression_loss: 0.5302 - classification_loss: 0.0530 254/500 [==============>...............] 
- ETA: 1:33 - loss: 0.5826 - regression_loss: 0.5296 - classification_loss: 0.0530 255/500 [==============>...............] - ETA: 1:33 - loss: 0.5840 - regression_loss: 0.5308 - classification_loss: 0.0532 256/500 [==============>...............] - ETA: 1:32 - loss: 0.5846 - regression_loss: 0.5313 - classification_loss: 0.0533 257/500 [==============>...............] - ETA: 1:32 - loss: 0.5844 - regression_loss: 0.5312 - classification_loss: 0.0533 258/500 [==============>...............] - ETA: 1:32 - loss: 0.5846 - regression_loss: 0.5314 - classification_loss: 0.0532 259/500 [==============>...............] - ETA: 1:31 - loss: 0.5852 - regression_loss: 0.5319 - classification_loss: 0.0533 260/500 [==============>...............] - ETA: 1:31 - loss: 0.5846 - regression_loss: 0.5313 - classification_loss: 0.0533 261/500 [==============>...............] - ETA: 1:31 - loss: 0.5850 - regression_loss: 0.5316 - classification_loss: 0.0534 262/500 [==============>...............] - ETA: 1:30 - loss: 0.5850 - regression_loss: 0.5317 - classification_loss: 0.0534 263/500 [==============>...............] - ETA: 1:30 - loss: 0.5857 - regression_loss: 0.5323 - classification_loss: 0.0534 264/500 [==============>...............] - ETA: 1:29 - loss: 0.5863 - regression_loss: 0.5328 - classification_loss: 0.0535 265/500 [==============>...............] - ETA: 1:29 - loss: 0.5864 - regression_loss: 0.5330 - classification_loss: 0.0535 266/500 [==============>...............] - ETA: 1:29 - loss: 0.5861 - regression_loss: 0.5326 - classification_loss: 0.0534 267/500 [===============>..............] - ETA: 1:28 - loss: 0.5855 - regression_loss: 0.5321 - classification_loss: 0.0534 268/500 [===============>..............] - ETA: 1:28 - loss: 0.5870 - regression_loss: 0.5334 - classification_loss: 0.0536 269/500 [===============>..............] - ETA: 1:28 - loss: 0.5864 - regression_loss: 0.5330 - classification_loss: 0.0535 270/500 [===============>..............] 
- ETA: 1:27 - loss: 0.5888 - regression_loss: 0.5350 - classification_loss: 0.0538 271/500 [===============>..............] - ETA: 1:27 - loss: 0.5883 - regression_loss: 0.5345 - classification_loss: 0.0538 272/500 [===============>..............] - ETA: 1:26 - loss: 0.5883 - regression_loss: 0.5346 - classification_loss: 0.0537 273/500 [===============>..............] - ETA: 1:26 - loss: 0.5909 - regression_loss: 0.5370 - classification_loss: 0.0538 274/500 [===============>..............] - ETA: 1:26 - loss: 0.5893 - regression_loss: 0.5356 - classification_loss: 0.0537 275/500 [===============>..............] - ETA: 1:25 - loss: 0.5904 - regression_loss: 0.5367 - classification_loss: 0.0538 276/500 [===============>..............] - ETA: 1:25 - loss: 0.5892 - regression_loss: 0.5356 - classification_loss: 0.0536 277/500 [===============>..............] - ETA: 1:25 - loss: 0.5896 - regression_loss: 0.5359 - classification_loss: 0.0537 278/500 [===============>..............] - ETA: 1:24 - loss: 0.5903 - regression_loss: 0.5365 - classification_loss: 0.0538 279/500 [===============>..............] - ETA: 1:24 - loss: 0.5899 - regression_loss: 0.5362 - classification_loss: 0.0537 280/500 [===============>..............] - ETA: 1:23 - loss: 0.5894 - regression_loss: 0.5358 - classification_loss: 0.0536 281/500 [===============>..............] - ETA: 1:23 - loss: 0.5899 - regression_loss: 0.5363 - classification_loss: 0.0537 282/500 [===============>..............] - ETA: 1:23 - loss: 0.5898 - regression_loss: 0.5360 - classification_loss: 0.0538 283/500 [===============>..............] - ETA: 1:22 - loss: 0.5895 - regression_loss: 0.5357 - classification_loss: 0.0538 284/500 [================>.............] - ETA: 1:22 - loss: 0.5892 - regression_loss: 0.5356 - classification_loss: 0.0537 285/500 [================>.............] - ETA: 1:21 - loss: 0.5891 - regression_loss: 0.5355 - classification_loss: 0.0537 286/500 [================>.............] 
- ETA: 1:21 - loss: 0.5898 - regression_loss: 0.5361 - classification_loss: 0.0537 287/500 [================>.............] - ETA: 1:21 - loss: 0.5907 - regression_loss: 0.5369 - classification_loss: 0.0539 288/500 [================>.............] - ETA: 1:20 - loss: 0.5912 - regression_loss: 0.5372 - classification_loss: 0.0540 289/500 [================>.............] - ETA: 1:20 - loss: 0.5913 - regression_loss: 0.5374 - classification_loss: 0.0539 290/500 [================>.............] - ETA: 1:19 - loss: 0.5901 - regression_loss: 0.5364 - classification_loss: 0.0538 291/500 [================>.............] - ETA: 1:19 - loss: 0.5903 - regression_loss: 0.5365 - classification_loss: 0.0538 292/500 [================>.............] - ETA: 1:19 - loss: 0.5901 - regression_loss: 0.5364 - classification_loss: 0.0537 293/500 [================>.............] - ETA: 1:18 - loss: 0.5904 - regression_loss: 0.5367 - classification_loss: 0.0537 294/500 [================>.............] - ETA: 1:18 - loss: 0.5895 - regression_loss: 0.5359 - classification_loss: 0.0536 295/500 [================>.............] - ETA: 1:18 - loss: 0.5895 - regression_loss: 0.5359 - classification_loss: 0.0536 296/500 [================>.............] - ETA: 1:17 - loss: 0.5887 - regression_loss: 0.5353 - classification_loss: 0.0535 297/500 [================>.............] - ETA: 1:17 - loss: 0.5900 - regression_loss: 0.5362 - classification_loss: 0.0537 298/500 [================>.............] - ETA: 1:16 - loss: 0.5892 - regression_loss: 0.5356 - classification_loss: 0.0536 299/500 [================>.............] - ETA: 1:16 - loss: 0.5902 - regression_loss: 0.5365 - classification_loss: 0.0537 300/500 [=================>............] - ETA: 1:16 - loss: 0.5896 - regression_loss: 0.5359 - classification_loss: 0.0537 301/500 [=================>............] - ETA: 1:15 - loss: 0.5896 - regression_loss: 0.5359 - classification_loss: 0.0537 302/500 [=================>............] 
500/500 [==============================] - 190s 381ms/step - loss: 0.5716 - regression_loss: 0.5195 - classification_loss: 0.0521
1172 instances of class plum with average precision: 0.7236
mAP: 0.7236
Epoch 00019: saving model to ./training/snapshots/resnet101_pascal_19.h5
Epoch 20/20
- ETA: 2:19 - loss: 0.5476 - regression_loss: 0.4953 - classification_loss: 0.0522 138/500 [=======>......................] - ETA: 2:18 - loss: 0.5473 - regression_loss: 0.4952 - classification_loss: 0.0521 139/500 [=======>......................] - ETA: 2:18 - loss: 0.5460 - regression_loss: 0.4940 - classification_loss: 0.0520 140/500 [=======>......................] - ETA: 2:17 - loss: 0.5454 - regression_loss: 0.4937 - classification_loss: 0.0517 141/500 [=======>......................] - ETA: 2:17 - loss: 0.5452 - regression_loss: 0.4936 - classification_loss: 0.0517 142/500 [=======>......................] - ETA: 2:17 - loss: 0.5451 - regression_loss: 0.4934 - classification_loss: 0.0517 143/500 [=======>......................] - ETA: 2:16 - loss: 0.5459 - regression_loss: 0.4941 - classification_loss: 0.0518 144/500 [=======>......................] - ETA: 2:16 - loss: 0.5453 - regression_loss: 0.4937 - classification_loss: 0.0516 145/500 [=======>......................] - ETA: 2:15 - loss: 0.5482 - regression_loss: 0.4966 - classification_loss: 0.0517 146/500 [=======>......................] - ETA: 2:15 - loss: 0.5476 - regression_loss: 0.4960 - classification_loss: 0.0515 147/500 [=======>......................] - ETA: 2:15 - loss: 0.5469 - regression_loss: 0.4954 - classification_loss: 0.0514 148/500 [=======>......................] - ETA: 2:14 - loss: 0.5478 - regression_loss: 0.4963 - classification_loss: 0.0515 149/500 [=======>......................] - ETA: 2:14 - loss: 0.5457 - regression_loss: 0.4945 - classification_loss: 0.0513 150/500 [========>.....................] - ETA: 2:14 - loss: 0.5450 - regression_loss: 0.4936 - classification_loss: 0.0514 151/500 [========>.....................] - ETA: 2:13 - loss: 0.5456 - regression_loss: 0.4941 - classification_loss: 0.0515 152/500 [========>.....................] - ETA: 2:13 - loss: 0.5464 - regression_loss: 0.4948 - classification_loss: 0.0516 153/500 [========>.....................] 
- ETA: 2:12 - loss: 0.5448 - regression_loss: 0.4934 - classification_loss: 0.0514 154/500 [========>.....................] - ETA: 2:12 - loss: 0.5449 - regression_loss: 0.4935 - classification_loss: 0.0514 155/500 [========>.....................] - ETA: 2:12 - loss: 0.5449 - regression_loss: 0.4936 - classification_loss: 0.0513 156/500 [========>.....................] - ETA: 2:11 - loss: 0.5450 - regression_loss: 0.4937 - classification_loss: 0.0513 157/500 [========>.....................] - ETA: 2:11 - loss: 0.5433 - regression_loss: 0.4921 - classification_loss: 0.0512 158/500 [========>.....................] - ETA: 2:11 - loss: 0.5424 - regression_loss: 0.4914 - classification_loss: 0.0510 159/500 [========>.....................] - ETA: 2:10 - loss: 0.5434 - regression_loss: 0.4923 - classification_loss: 0.0510 160/500 [========>.....................] - ETA: 2:10 - loss: 0.5422 - regression_loss: 0.4913 - classification_loss: 0.0509 161/500 [========>.....................] - ETA: 2:10 - loss: 0.5417 - regression_loss: 0.4908 - classification_loss: 0.0509 162/500 [========>.....................] - ETA: 2:09 - loss: 0.5396 - regression_loss: 0.4890 - classification_loss: 0.0507 163/500 [========>.....................] - ETA: 2:09 - loss: 0.5379 - regression_loss: 0.4875 - classification_loss: 0.0505 164/500 [========>.....................] - ETA: 2:08 - loss: 0.5396 - regression_loss: 0.4892 - classification_loss: 0.0504 165/500 [========>.....................] - ETA: 2:08 - loss: 0.5409 - regression_loss: 0.4904 - classification_loss: 0.0505 166/500 [========>.....................] - ETA: 2:08 - loss: 0.5416 - regression_loss: 0.4909 - classification_loss: 0.0507 167/500 [=========>....................] - ETA: 2:07 - loss: 0.5412 - regression_loss: 0.4906 - classification_loss: 0.0505 168/500 [=========>....................] - ETA: 2:07 - loss: 0.5413 - regression_loss: 0.4907 - classification_loss: 0.0506 169/500 [=========>....................] 
- ETA: 2:07 - loss: 0.5402 - regression_loss: 0.4898 - classification_loss: 0.0504 170/500 [=========>....................] - ETA: 2:06 - loss: 0.5399 - regression_loss: 0.4895 - classification_loss: 0.0504 171/500 [=========>....................] - ETA: 2:06 - loss: 0.5413 - regression_loss: 0.4909 - classification_loss: 0.0505 172/500 [=========>....................] - ETA: 2:05 - loss: 0.5423 - regression_loss: 0.4917 - classification_loss: 0.0505 173/500 [=========>....................] - ETA: 2:05 - loss: 0.5419 - regression_loss: 0.4915 - classification_loss: 0.0504 174/500 [=========>....................] - ETA: 2:05 - loss: 0.5423 - regression_loss: 0.4918 - classification_loss: 0.0505 175/500 [=========>....................] - ETA: 2:04 - loss: 0.5427 - regression_loss: 0.4922 - classification_loss: 0.0506 176/500 [=========>....................] - ETA: 2:04 - loss: 0.5423 - regression_loss: 0.4917 - classification_loss: 0.0505 177/500 [=========>....................] - ETA: 2:04 - loss: 0.5436 - regression_loss: 0.4929 - classification_loss: 0.0508 178/500 [=========>....................] - ETA: 2:03 - loss: 0.5455 - regression_loss: 0.4947 - classification_loss: 0.0507 179/500 [=========>....................] - ETA: 2:03 - loss: 0.5451 - regression_loss: 0.4944 - classification_loss: 0.0507 180/500 [=========>....................] - ETA: 2:02 - loss: 0.5478 - regression_loss: 0.4969 - classification_loss: 0.0509 181/500 [=========>....................] - ETA: 2:02 - loss: 0.5476 - regression_loss: 0.4967 - classification_loss: 0.0509 182/500 [=========>....................] - ETA: 2:02 - loss: 0.5475 - regression_loss: 0.4967 - classification_loss: 0.0508 183/500 [=========>....................] - ETA: 2:01 - loss: 0.5462 - regression_loss: 0.4956 - classification_loss: 0.0507 184/500 [==========>...................] - ETA: 2:01 - loss: 0.5454 - regression_loss: 0.4948 - classification_loss: 0.0506 185/500 [==========>...................] 
- ETA: 2:00 - loss: 0.5447 - regression_loss: 0.4942 - classification_loss: 0.0505 186/500 [==========>...................] - ETA: 2:00 - loss: 0.5442 - regression_loss: 0.4938 - classification_loss: 0.0505 187/500 [==========>...................] - ETA: 2:00 - loss: 0.5452 - regression_loss: 0.4946 - classification_loss: 0.0506 188/500 [==========>...................] - ETA: 1:59 - loss: 0.5454 - regression_loss: 0.4948 - classification_loss: 0.0506 189/500 [==========>...................] - ETA: 1:59 - loss: 0.5444 - regression_loss: 0.4940 - classification_loss: 0.0505 190/500 [==========>...................] - ETA: 1:59 - loss: 0.5438 - regression_loss: 0.4935 - classification_loss: 0.0503 191/500 [==========>...................] - ETA: 1:58 - loss: 0.5442 - regression_loss: 0.4939 - classification_loss: 0.0503 192/500 [==========>...................] - ETA: 1:58 - loss: 0.5446 - regression_loss: 0.4943 - classification_loss: 0.0503 193/500 [==========>...................] - ETA: 1:57 - loss: 0.5438 - regression_loss: 0.4936 - classification_loss: 0.0501 194/500 [==========>...................] - ETA: 1:57 - loss: 0.5438 - regression_loss: 0.4937 - classification_loss: 0.0501 195/500 [==========>...................] - ETA: 1:57 - loss: 0.5442 - regression_loss: 0.4942 - classification_loss: 0.0500 196/500 [==========>...................] - ETA: 1:56 - loss: 0.5443 - regression_loss: 0.4943 - classification_loss: 0.0500 197/500 [==========>...................] - ETA: 1:56 - loss: 0.5444 - regression_loss: 0.4944 - classification_loss: 0.0501 198/500 [==========>...................] - ETA: 1:56 - loss: 0.5447 - regression_loss: 0.4947 - classification_loss: 0.0500 199/500 [==========>...................] - ETA: 1:55 - loss: 0.5447 - regression_loss: 0.4947 - classification_loss: 0.0500 200/500 [===========>..................] - ETA: 1:55 - loss: 0.5447 - regression_loss: 0.4947 - classification_loss: 0.0500 201/500 [===========>..................] 
- ETA: 1:54 - loss: 0.5457 - regression_loss: 0.4955 - classification_loss: 0.0501 202/500 [===========>..................] - ETA: 1:54 - loss: 0.5458 - regression_loss: 0.4955 - classification_loss: 0.0503 203/500 [===========>..................] - ETA: 1:54 - loss: 0.5450 - regression_loss: 0.4949 - classification_loss: 0.0501 204/500 [===========>..................] - ETA: 1:53 - loss: 0.5453 - regression_loss: 0.4951 - classification_loss: 0.0502 205/500 [===========>..................] - ETA: 1:53 - loss: 0.5459 - regression_loss: 0.4956 - classification_loss: 0.0504 206/500 [===========>..................] - ETA: 1:52 - loss: 0.5459 - regression_loss: 0.4955 - classification_loss: 0.0504 207/500 [===========>..................] - ETA: 1:52 - loss: 0.5454 - regression_loss: 0.4951 - classification_loss: 0.0503 208/500 [===========>..................] - ETA: 1:52 - loss: 0.5460 - regression_loss: 0.4955 - classification_loss: 0.0505 209/500 [===========>..................] - ETA: 1:51 - loss: 0.5453 - regression_loss: 0.4949 - classification_loss: 0.0504 210/500 [===========>..................] - ETA: 1:51 - loss: 0.5459 - regression_loss: 0.4953 - classification_loss: 0.0506 211/500 [===========>..................] - ETA: 1:51 - loss: 0.5452 - regression_loss: 0.4949 - classification_loss: 0.0504 212/500 [===========>..................] - ETA: 1:50 - loss: 0.5448 - regression_loss: 0.4945 - classification_loss: 0.0503 213/500 [===========>..................] - ETA: 1:50 - loss: 0.5458 - regression_loss: 0.4953 - classification_loss: 0.0505 214/500 [===========>..................] - ETA: 1:49 - loss: 0.5449 - regression_loss: 0.4945 - classification_loss: 0.0504 215/500 [===========>..................] - ETA: 1:49 - loss: 0.5455 - regression_loss: 0.4951 - classification_loss: 0.0504 216/500 [===========>..................] - ETA: 1:49 - loss: 0.5439 - regression_loss: 0.4936 - classification_loss: 0.0503 217/500 [============>.................] 
- ETA: 1:48 - loss: 0.5432 - regression_loss: 0.4930 - classification_loss: 0.0502 218/500 [============>.................] - ETA: 1:48 - loss: 0.5432 - regression_loss: 0.4930 - classification_loss: 0.0501 219/500 [============>.................] - ETA: 1:47 - loss: 0.5425 - regression_loss: 0.4925 - classification_loss: 0.0501 220/500 [============>.................] - ETA: 1:47 - loss: 0.5433 - regression_loss: 0.4931 - classification_loss: 0.0502 221/500 [============>.................] - ETA: 1:47 - loss: 0.5427 - regression_loss: 0.4927 - classification_loss: 0.0501 222/500 [============>.................] - ETA: 1:46 - loss: 0.5414 - regression_loss: 0.4915 - classification_loss: 0.0499 223/500 [============>.................] - ETA: 1:46 - loss: 0.5400 - regression_loss: 0.4902 - classification_loss: 0.0498 224/500 [============>.................] - ETA: 1:45 - loss: 0.5396 - regression_loss: 0.4899 - classification_loss: 0.0498 225/500 [============>.................] - ETA: 1:45 - loss: 0.5399 - regression_loss: 0.4901 - classification_loss: 0.0498 226/500 [============>.................] - ETA: 1:45 - loss: 0.5402 - regression_loss: 0.4904 - classification_loss: 0.0498 227/500 [============>.................] - ETA: 1:44 - loss: 0.5397 - regression_loss: 0.4899 - classification_loss: 0.0498 228/500 [============>.................] - ETA: 1:44 - loss: 0.5391 - regression_loss: 0.4894 - classification_loss: 0.0496 229/500 [============>.................] - ETA: 1:44 - loss: 0.5389 - regression_loss: 0.4892 - classification_loss: 0.0497 230/500 [============>.................] - ETA: 1:43 - loss: 0.5399 - regression_loss: 0.4902 - classification_loss: 0.0497 231/500 [============>.................] - ETA: 1:43 - loss: 0.5399 - regression_loss: 0.4902 - classification_loss: 0.0496 232/500 [============>.................] - ETA: 1:42 - loss: 0.5412 - regression_loss: 0.4914 - classification_loss: 0.0499 233/500 [============>.................] 
- ETA: 1:42 - loss: 0.5405 - regression_loss: 0.4908 - classification_loss: 0.0497 234/500 [=============>................] - ETA: 1:42 - loss: 0.5414 - regression_loss: 0.4916 - classification_loss: 0.0497 235/500 [=============>................] - ETA: 1:41 - loss: 0.5425 - regression_loss: 0.4926 - classification_loss: 0.0499 236/500 [=============>................] - ETA: 1:41 - loss: 0.5420 - regression_loss: 0.4921 - classification_loss: 0.0498 237/500 [=============>................] - ETA: 1:40 - loss: 0.5425 - regression_loss: 0.4927 - classification_loss: 0.0498 238/500 [=============>................] - ETA: 1:40 - loss: 0.5437 - regression_loss: 0.4937 - classification_loss: 0.0499 239/500 [=============>................] - ETA: 1:40 - loss: 0.5427 - regression_loss: 0.4929 - classification_loss: 0.0498 240/500 [=============>................] - ETA: 1:39 - loss: 0.5436 - regression_loss: 0.4938 - classification_loss: 0.0498 241/500 [=============>................] - ETA: 1:39 - loss: 0.5457 - regression_loss: 0.4958 - classification_loss: 0.0499 242/500 [=============>................] - ETA: 1:39 - loss: 0.5447 - regression_loss: 0.4949 - classification_loss: 0.0498 243/500 [=============>................] - ETA: 1:38 - loss: 0.5440 - regression_loss: 0.4942 - classification_loss: 0.0497 244/500 [=============>................] - ETA: 1:38 - loss: 0.5436 - regression_loss: 0.4940 - classification_loss: 0.0496 245/500 [=============>................] - ETA: 1:37 - loss: 0.5437 - regression_loss: 0.4941 - classification_loss: 0.0496 246/500 [=============>................] - ETA: 1:37 - loss: 0.5450 - regression_loss: 0.4953 - classification_loss: 0.0497 247/500 [=============>................] - ETA: 1:37 - loss: 0.5452 - regression_loss: 0.4955 - classification_loss: 0.0497 248/500 [=============>................] - ETA: 1:36 - loss: 0.5463 - regression_loss: 0.4965 - classification_loss: 0.0498 249/500 [=============>................] 
- ETA: 1:36 - loss: 0.5477 - regression_loss: 0.4976 - classification_loss: 0.0501 250/500 [==============>...............] - ETA: 1:35 - loss: 0.5491 - regression_loss: 0.4988 - classification_loss: 0.0503 251/500 [==============>...............] - ETA: 1:35 - loss: 0.5509 - regression_loss: 0.5003 - classification_loss: 0.0506 252/500 [==============>...............] - ETA: 1:35 - loss: 0.5512 - regression_loss: 0.5006 - classification_loss: 0.0506 253/500 [==============>...............] - ETA: 1:34 - loss: 0.5511 - regression_loss: 0.5005 - classification_loss: 0.0506 254/500 [==============>...............] - ETA: 1:34 - loss: 0.5508 - regression_loss: 0.5002 - classification_loss: 0.0506 255/500 [==============>...............] - ETA: 1:33 - loss: 0.5508 - regression_loss: 0.5002 - classification_loss: 0.0506 256/500 [==============>...............] - ETA: 1:33 - loss: 0.5496 - regression_loss: 0.4991 - classification_loss: 0.0505 257/500 [==============>...............] - ETA: 1:33 - loss: 0.5483 - regression_loss: 0.4980 - classification_loss: 0.0503 258/500 [==============>...............] - ETA: 1:32 - loss: 0.5481 - regression_loss: 0.4977 - classification_loss: 0.0504 259/500 [==============>...............] - ETA: 1:32 - loss: 0.5474 - regression_loss: 0.4971 - classification_loss: 0.0503 260/500 [==============>...............] - ETA: 1:32 - loss: 0.5466 - regression_loss: 0.4964 - classification_loss: 0.0502 261/500 [==============>...............] - ETA: 1:31 - loss: 0.5460 - regression_loss: 0.4959 - classification_loss: 0.0501 262/500 [==============>...............] - ETA: 1:31 - loss: 0.5454 - regression_loss: 0.4953 - classification_loss: 0.0501 263/500 [==============>...............] - ETA: 1:30 - loss: 0.5448 - regression_loss: 0.4948 - classification_loss: 0.0500 264/500 [==============>...............] - ETA: 1:30 - loss: 0.5440 - regression_loss: 0.4941 - classification_loss: 0.0499 265/500 [==============>...............] 
- ETA: 1:30 - loss: 0.5438 - regression_loss: 0.4939 - classification_loss: 0.0499 266/500 [==============>...............] - ETA: 1:29 - loss: 0.5439 - regression_loss: 0.4939 - classification_loss: 0.0499 267/500 [===============>..............] - ETA: 1:29 - loss: 0.5441 - regression_loss: 0.4941 - classification_loss: 0.0500 268/500 [===============>..............] - ETA: 1:28 - loss: 0.5436 - regression_loss: 0.4936 - classification_loss: 0.0500 269/500 [===============>..............] - ETA: 1:28 - loss: 0.5433 - regression_loss: 0.4934 - classification_loss: 0.0498 270/500 [===============>..............] - ETA: 1:28 - loss: 0.5426 - regression_loss: 0.4929 - classification_loss: 0.0497 271/500 [===============>..............] - ETA: 1:27 - loss: 0.5430 - regression_loss: 0.4933 - classification_loss: 0.0497 272/500 [===============>..............] - ETA: 1:27 - loss: 0.5423 - regression_loss: 0.4927 - classification_loss: 0.0496 273/500 [===============>..............] - ETA: 1:27 - loss: 0.5442 - regression_loss: 0.4946 - classification_loss: 0.0496 274/500 [===============>..............] - ETA: 1:26 - loss: 0.5437 - regression_loss: 0.4942 - classification_loss: 0.0495 275/500 [===============>..............] - ETA: 1:26 - loss: 0.5436 - regression_loss: 0.4942 - classification_loss: 0.0495 276/500 [===============>..............] - ETA: 1:25 - loss: 0.5432 - regression_loss: 0.4937 - classification_loss: 0.0494 277/500 [===============>..............] - ETA: 1:25 - loss: 0.5442 - regression_loss: 0.4946 - classification_loss: 0.0496 278/500 [===============>..............] - ETA: 1:25 - loss: 0.5437 - regression_loss: 0.4942 - classification_loss: 0.0495 279/500 [===============>..............] - ETA: 1:24 - loss: 0.5445 - regression_loss: 0.4950 - classification_loss: 0.0495 280/500 [===============>..............] - ETA: 1:24 - loss: 0.5450 - regression_loss: 0.4954 - classification_loss: 0.0495 281/500 [===============>..............] 
- ETA: 1:23 - loss: 0.5451 - regression_loss: 0.4955 - classification_loss: 0.0495 282/500 [===============>..............] - ETA: 1:23 - loss: 0.5466 - regression_loss: 0.4972 - classification_loss: 0.0495 283/500 [===============>..............] - ETA: 1:23 - loss: 0.5461 - regression_loss: 0.4967 - classification_loss: 0.0494 284/500 [================>.............] - ETA: 1:22 - loss: 0.5458 - regression_loss: 0.4964 - classification_loss: 0.0494 285/500 [================>.............] - ETA: 1:22 - loss: 0.5459 - regression_loss: 0.4965 - classification_loss: 0.0494 286/500 [================>.............] - ETA: 1:22 - loss: 0.5466 - regression_loss: 0.4971 - classification_loss: 0.0495 287/500 [================>.............] - ETA: 1:21 - loss: 0.5456 - regression_loss: 0.4962 - classification_loss: 0.0494 288/500 [================>.............] - ETA: 1:21 - loss: 0.5442 - regression_loss: 0.4949 - classification_loss: 0.0493 289/500 [================>.............] - ETA: 1:20 - loss: 0.5438 - regression_loss: 0.4946 - classification_loss: 0.0493 290/500 [================>.............] - ETA: 1:20 - loss: 0.5434 - regression_loss: 0.4942 - classification_loss: 0.0491 291/500 [================>.............] - ETA: 1:20 - loss: 0.5437 - regression_loss: 0.4945 - classification_loss: 0.0492 292/500 [================>.............] - ETA: 1:19 - loss: 0.5436 - regression_loss: 0.4944 - classification_loss: 0.0492 293/500 [================>.............] - ETA: 1:19 - loss: 0.5432 - regression_loss: 0.4940 - classification_loss: 0.0492 294/500 [================>.............] - ETA: 1:18 - loss: 0.5437 - regression_loss: 0.4944 - classification_loss: 0.0493 295/500 [================>.............] - ETA: 1:18 - loss: 0.5430 - regression_loss: 0.4938 - classification_loss: 0.0492 296/500 [================>.............] - ETA: 1:18 - loss: 0.5423 - regression_loss: 0.4932 - classification_loss: 0.0491 297/500 [================>.............] 
- ETA: 1:17 - loss: 0.5413 - regression_loss: 0.4923 - classification_loss: 0.0490 298/500 [================>.............] - ETA: 1:17 - loss: 0.5416 - regression_loss: 0.4925 - classification_loss: 0.0491 299/500 [================>.............] - ETA: 1:17 - loss: 0.5412 - regression_loss: 0.4921 - classification_loss: 0.0491 300/500 [=================>............] - ETA: 1:16 - loss: 0.5406 - regression_loss: 0.4916 - classification_loss: 0.0490 301/500 [=================>............] - ETA: 1:16 - loss: 0.5414 - regression_loss: 0.4923 - classification_loss: 0.0490 302/500 [=================>............] - ETA: 1:15 - loss: 0.5412 - regression_loss: 0.4921 - classification_loss: 0.0490 303/500 [=================>............] - ETA: 1:15 - loss: 0.5419 - regression_loss: 0.4926 - classification_loss: 0.0492 304/500 [=================>............] - ETA: 1:15 - loss: 0.5419 - regression_loss: 0.4927 - classification_loss: 0.0492 305/500 [=================>............] - ETA: 1:14 - loss: 0.5417 - regression_loss: 0.4925 - classification_loss: 0.0492 306/500 [=================>............] - ETA: 1:14 - loss: 0.5424 - regression_loss: 0.4931 - classification_loss: 0.0492 307/500 [=================>............] - ETA: 1:13 - loss: 0.5420 - regression_loss: 0.4928 - classification_loss: 0.0492 308/500 [=================>............] - ETA: 1:13 - loss: 0.5427 - regression_loss: 0.4934 - classification_loss: 0.0493 309/500 [=================>............] - ETA: 1:13 - loss: 0.5431 - regression_loss: 0.4938 - classification_loss: 0.0493 310/500 [=================>............] - ETA: 1:12 - loss: 0.5434 - regression_loss: 0.4940 - classification_loss: 0.0494 311/500 [=================>............] - ETA: 1:12 - loss: 0.5442 - regression_loss: 0.4948 - classification_loss: 0.0494 312/500 [=================>............] - ETA: 1:12 - loss: 0.5442 - regression_loss: 0.4948 - classification_loss: 0.0494 313/500 [=================>............] 
- ETA: 1:11 - loss: 0.5444 - regression_loss: 0.4950 - classification_loss: 0.0494 314/500 [=================>............] - ETA: 1:11 - loss: 0.5438 - regression_loss: 0.4944 - classification_loss: 0.0494 315/500 [=================>............] - ETA: 1:11 - loss: 0.5435 - regression_loss: 0.4941 - classification_loss: 0.0494 316/500 [=================>............] - ETA: 1:10 - loss: 0.5438 - regression_loss: 0.4943 - classification_loss: 0.0494 317/500 [==================>...........] - ETA: 1:10 - loss: 0.5436 - regression_loss: 0.4942 - classification_loss: 0.0494 318/500 [==================>...........] - ETA: 1:09 - loss: 0.5436 - regression_loss: 0.4943 - classification_loss: 0.0494 319/500 [==================>...........] - ETA: 1:09 - loss: 0.5427 - regression_loss: 0.4934 - classification_loss: 0.0493 320/500 [==================>...........] - ETA: 1:09 - loss: 0.5418 - regression_loss: 0.4926 - classification_loss: 0.0492 321/500 [==================>...........] - ETA: 1:08 - loss: 0.5420 - regression_loss: 0.4928 - classification_loss: 0.0492 322/500 [==================>...........] - ETA: 1:08 - loss: 0.5419 - regression_loss: 0.4927 - classification_loss: 0.0492 323/500 [==================>...........] - ETA: 1:07 - loss: 0.5425 - regression_loss: 0.4933 - classification_loss: 0.0491 324/500 [==================>...........] - ETA: 1:07 - loss: 0.5417 - regression_loss: 0.4927 - classification_loss: 0.0490 325/500 [==================>...........] - ETA: 1:07 - loss: 0.5422 - regression_loss: 0.4930 - classification_loss: 0.0491 326/500 [==================>...........] - ETA: 1:06 - loss: 0.5417 - regression_loss: 0.4927 - classification_loss: 0.0491 327/500 [==================>...........] - ETA: 1:06 - loss: 0.5424 - regression_loss: 0.4933 - classification_loss: 0.0492 328/500 [==================>...........] - ETA: 1:05 - loss: 0.5421 - regression_loss: 0.4930 - classification_loss: 0.0492 329/500 [==================>...........] 
- ETA: 1:05 - loss: 0.5413 - regression_loss: 0.4922 - classification_loss: 0.0491 330/500 [==================>...........] - ETA: 1:05 - loss: 0.5409 - regression_loss: 0.4919 - classification_loss: 0.0490 331/500 [==================>...........] - ETA: 1:04 - loss: 0.5410 - regression_loss: 0.4920 - classification_loss: 0.0490 332/500 [==================>...........] - ETA: 1:04 - loss: 0.5420 - regression_loss: 0.4928 - classification_loss: 0.0492 333/500 [==================>...........] - ETA: 1:04 - loss: 0.5417 - regression_loss: 0.4925 - classification_loss: 0.0492 334/500 [===================>..........] - ETA: 1:03 - loss: 0.5422 - regression_loss: 0.4930 - classification_loss: 0.0492 335/500 [===================>..........] - ETA: 1:03 - loss: 0.5418 - regression_loss: 0.4927 - classification_loss: 0.0491 336/500 [===================>..........] - ETA: 1:02 - loss: 0.5411 - regression_loss: 0.4921 - classification_loss: 0.0490 337/500 [===================>..........] - ETA: 1:02 - loss: 0.5416 - regression_loss: 0.4926 - classification_loss: 0.0490 338/500 [===================>..........] - ETA: 1:02 - loss: 0.5413 - regression_loss: 0.4924 - classification_loss: 0.0489 339/500 [===================>..........] - ETA: 1:01 - loss: 0.5410 - regression_loss: 0.4922 - classification_loss: 0.0488 340/500 [===================>..........] - ETA: 1:01 - loss: 0.5407 - regression_loss: 0.4919 - classification_loss: 0.0488 341/500 [===================>..........] - ETA: 1:00 - loss: 0.5410 - regression_loss: 0.4922 - classification_loss: 0.0488 342/500 [===================>..........] - ETA: 1:00 - loss: 0.5418 - regression_loss: 0.4930 - classification_loss: 0.0488 343/500 [===================>..........] - ETA: 1:00 - loss: 0.5426 - regression_loss: 0.4937 - classification_loss: 0.0489 344/500 [===================>..........] - ETA: 59s - loss: 0.5425 - regression_loss: 0.4936 - classification_loss: 0.0489  345/500 [===================>..........] 
[... per-batch progress updates for steps 345-498 elided; total loss held steady in the 0.53-0.54 range (regression_loss ~0.49, classification_loss ~0.048) ...]
499/500 [============================>.] 
- ETA: 0s - loss: 0.5318 - regression_loss: 0.4842 - classification_loss: 0.0476 
500/500 [==============================] - 191s 383ms/step - loss: 0.5316 - regression_loss: 0.4840 - classification_loss: 0.0476 
1172 instances of class plum with average precision: 0.7051 
mAP: 0.7051 
Epoch 00020: saving model to ./training/snapshots/resnet101_pascal_20.h5 
Now training depth backbone as well 
Model: "retinanet"
[... layer-by-layer summary elided: the same ResNet-101 backbone as above with layers suffixed "_rgb", now joined by a parallel depth backbone with layers suffixed "_d" (fed via lambda_2 from the fourth channel of the 4-channel input); output-shape column truncated by Keras, listing cut off at bn4b2_branch2a_rgb ...]
(BatchNormal (None, None, None, 2 1024 res4b2_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2a_relu_d (Activati (None, None, None, 6 0 res2a_branch2a_d[0][0] __________________________________________________________________________________________________ res4b2_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b2_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding2a_branch2b_d (ZeroPaddi (None, None, None, 6 0 res2a_branch2a_relu_d[0][0] __________________________________________________________________________________________________ padding4b2_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b2_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2b_d (Conv2D) (None, None, None, 6 36864 padding2a_branch2b_d[0][0] __________________________________________________________________________________________________ res4b2_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b2_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2b_relu_d (Activati (None, None, None, 6 0 res2a_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b2_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b2_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2a_branch2c_d (Conv2D) (None, None, None, 2 16384 res2a_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res2a_branch1_d (Conv2D) (None, None, None, 2 16384 pool1_d[0][0] __________________________________________________________________________________________________ res4b2_branch2b_relu_rgb (Activ (None, 
None, None, 2 0 bn4b2_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2a_d (Add) (None, None, None, 2 0 res2a_branch2c_d[0][0] res2a_branch1_d[0][0] __________________________________________________________________________________________________ res4b2_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b2_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res2a_relu_d (Activation) (None, None, None, 2 0 res2a_d[0][0] __________________________________________________________________________________________________ bn4b2_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b2_branch2c_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2a_d (Conv2D) (None, None, None, 6 16384 res2a_relu_d[0][0] __________________________________________________________________________________________________ res4b2_rgb (Add) (None, None, None, 1 0 bn4b2_branch2c_rgb[0][0] res4b1_relu_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2a_relu_d (Activati (None, None, None, 6 0 res2b_branch2a_d[0][0] __________________________________________________________________________________________________ res4b2_relu_rgb (Activation) (None, None, None, 1 0 res4b2_rgb[0][0] __________________________________________________________________________________________________ padding2b_branch2b_d (ZeroPaddi (None, None, None, 6 0 res2b_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b3_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b2_relu_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2b_d (Conv2D) (None, None, None, 6 36864 padding2b_branch2b_d[0][0] 
__________________________________________________________________________________________________ bn4b3_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b3_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2b_relu_d (Activati (None, None, None, 6 0 res2b_branch2b_d[0][0] __________________________________________________________________________________________________ res4b3_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b3_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2b_branch2c_d (Conv2D) (None, None, None, 2 16384 res2b_branch2b_relu_d[0][0] __________________________________________________________________________________________________ padding4b3_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b3_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res2b_d (Add) (None, None, None, 2 0 res2b_branch2c_d[0][0] res2a_relu_d[0][0] __________________________________________________________________________________________________ res4b3_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b3_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2b_relu_d (Activation) (None, None, None, 2 0 res2b_d[0][0] __________________________________________________________________________________________________ bn4b3_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b3_branch2b_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2a_d (Conv2D) (None, None, None, 6 16384 res2b_relu_d[0][0] __________________________________________________________________________________________________ res4b3_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b3_branch2b_rgb[0][0] 
__________________________________________________________________________________________________ res2c_branch2a_relu_d (Activati (None, None, None, 6 0 res2c_branch2a_d[0][0] __________________________________________________________________________________________________ res4b3_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b3_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ padding2c_branch2b_d (ZeroPaddi (None, None, None, 6 0 res2c_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b3_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b3_branch2c_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2b_d (Conv2D) (None, None, None, 6 36864 padding2c_branch2b_d[0][0] __________________________________________________________________________________________________ res4b3_rgb (Add) (None, None, None, 1 0 bn4b3_branch2c_rgb[0][0] res4b2_relu_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2b_relu_d (Activati (None, None, None, 6 0 res2c_branch2b_d[0][0] __________________________________________________________________________________________________ res4b3_relu_rgb (Activation) (None, None, None, 1 0 res4b3_rgb[0][0] __________________________________________________________________________________________________ res2c_branch2c_d (Conv2D) (None, None, None, 2 16384 res2c_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b4_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b3_relu_rgb[0][0] __________________________________________________________________________________________________ res2c_d (Add) (None, None, None, 2 0 res2c_branch2c_d[0][0] res2b_relu_d[0][0] 
__________________________________________________________________________________________________ bn4b4_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b4_branch2a_rgb[0][0] __________________________________________________________________________________________________ res2c_relu_d (Activation) (None, None, None, 2 0 res2c_d[0][0] __________________________________________________________________________________________________ res4b4_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b4_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2a_d (Conv2D) (None, None, None, 1 32768 res2c_relu_d[0][0] __________________________________________________________________________________________________ padding4b4_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b4_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2a_relu_d (Activati (None, None, None, 1 0 res3a_branch2a_d[0][0] __________________________________________________________________________________________________ res4b4_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b4_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding3a_branch2b_d (ZeroPaddi (None, None, None, 1 0 res3a_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b4_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b4_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2b_d (Conv2D) (None, None, None, 1 147456 padding3a_branch2b_d[0][0] __________________________________________________________________________________________________ res4b4_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b4_branch2b_rgb[0][0] 
__________________________________________________________________________________________________ res3a_branch2b_relu_d (Activati (None, None, None, 1 0 res3a_branch2b_d[0][0] __________________________________________________________________________________________________ res4b4_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b4_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res3a_branch2c_d (Conv2D) (None, None, None, 5 65536 res3a_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res3a_branch1_d (Conv2D) (None, None, None, 5 131072 res2c_relu_d[0][0] __________________________________________________________________________________________________ bn4b4_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b4_branch2c_rgb[0][0] __________________________________________________________________________________________________ res3a_d (Add) (None, None, None, 5 0 res3a_branch2c_d[0][0] res3a_branch1_d[0][0] __________________________________________________________________________________________________ res4b4_rgb (Add) (None, None, None, 1 0 bn4b4_branch2c_rgb[0][0] res4b3_relu_rgb[0][0] __________________________________________________________________________________________________ res3a_relu_d (Activation) (None, None, None, 5 0 res3a_d[0][0] __________________________________________________________________________________________________ res4b4_relu_rgb (Activation) (None, None, None, 1 0 res4b4_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2a_d (Conv2D) (None, None, None, 1 65536 res3a_relu_d[0][0] __________________________________________________________________________________________________ res4b5_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b4_relu_rgb[0][0] 
__________________________________________________________________________________________________ res3b1_branch2a_relu_d (Activat (None, None, None, 1 0 res3b1_branch2a_d[0][0] __________________________________________________________________________________________________ bn4b5_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b5_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding3b1_branch2b_d (ZeroPadd (None, None, None, 1 0 res3b1_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b5_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b5_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2b_d (Conv2D) (None, None, None, 1 147456 padding3b1_branch2b_d[0][0] __________________________________________________________________________________________________ padding4b5_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b5_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2b_relu_d (Activat (None, None, None, 1 0 res3b1_branch2b_d[0][0] __________________________________________________________________________________________________ res4b5_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b5_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b1_branch2c_d (Conv2D) (None, None, None, 5 65536 res3b1_branch2b_relu_d[0][0] __________________________________________________________________________________________________ bn4b5_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b5_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b1_d (Add) (None, None, None, 5 0 res3b1_branch2c_d[0][0] res3a_relu_d[0][0] 
__________________________________________________________________________________________________ res4b5_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b5_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b1_relu_d (Activation) (None, None, None, 5 0 res3b1_d[0][0] __________________________________________________________________________________________________ res4b5_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b5_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2a_d (Conv2D) (None, None, None, 1 65536 res3b1_relu_d[0][0] __________________________________________________________________________________________________ bn4b5_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b5_branch2c_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2a_relu_d (Activat (None, None, None, 1 0 res3b2_branch2a_d[0][0] __________________________________________________________________________________________________ res4b5_rgb (Add) (None, None, None, 1 0 bn4b5_branch2c_rgb[0][0] res4b4_relu_rgb[0][0] __________________________________________________________________________________________________ padding3b2_branch2b_d (ZeroPadd (None, None, None, 1 0 res3b2_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b5_relu_rgb (Activation) (None, None, None, 1 0 res4b5_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2b_d (Conv2D) (None, None, None, 1 147456 padding3b2_branch2b_d[0][0] __________________________________________________________________________________________________ res4b6_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b5_relu_rgb[0][0] 
__________________________________________________________________________________________________ res3b2_branch2b_relu_d (Activat (None, None, None, 1 0 res3b2_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b6_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b6_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3b2_branch2c_d (Conv2D) (None, None, None, 5 65536 res3b2_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b6_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b6_branch2a_rgb[0][0] __________________________________________________________________________________________________ res3b2_d (Add) (None, None, None, 5 0 res3b2_branch2c_d[0][0] res3b1_relu_d[0][0] __________________________________________________________________________________________________ padding4b6_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b6_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res3b2_relu_d (Activation) (None, None, None, 5 0 res3b2_d[0][0] __________________________________________________________________________________________________ res4b6_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b6_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2a_d (Conv2D) (None, None, None, 1 65536 res3b2_relu_d[0][0] __________________________________________________________________________________________________ bn4b6_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b6_branch2b_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2a_relu_d (Activat (None, None, None, 1 0 res3b3_branch2a_d[0][0] 
__________________________________________________________________________________________________ res4b6_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b6_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding3b3_branch2b_d (ZeroPadd (None, None, None, 1 0 res3b3_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b6_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b6_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2b_d (Conv2D) (None, None, None, 1 147456 padding3b3_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b6_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b6_branch2c_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2b_relu_d (Activat (None, None, None, 1 0 res3b3_branch2b_d[0][0] __________________________________________________________________________________________________ res4b6_rgb (Add) (None, None, None, 1 0 bn4b6_branch2c_rgb[0][0] res4b5_relu_rgb[0][0] __________________________________________________________________________________________________ res3b3_branch2c_d (Conv2D) (None, None, None, 5 65536 res3b3_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b6_relu_rgb (Activation) (None, None, None, 1 0 res4b6_rgb[0][0] __________________________________________________________________________________________________ res3b3_d (Add) (None, None, None, 5 0 res3b3_branch2c_d[0][0] res3b2_relu_d[0][0] __________________________________________________________________________________________________ res4b7_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b6_relu_rgb[0][0] 
__________________________________________________________________________________________________ res3b3_relu_d (Activation) (None, None, None, 5 0 res3b3_d[0][0] __________________________________________________________________________________________________ bn4b7_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b7_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2a_d (Conv2D) (None, None, None, 2 131072 res3b3_relu_d[0][0] __________________________________________________________________________________________________ res4b7_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b7_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2a_relu_d (Activati (None, None, None, 2 0 res4a_branch2a_d[0][0] __________________________________________________________________________________________________ padding4b7_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b7_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ padding4a_branch2b_d (ZeroPaddi (None, None, None, 2 0 res4a_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b7_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b7_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4a_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b7_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b7_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2b_relu_d (Activati (None, None, None, 2 0 res4a_branch2b_d[0][0] 
__________________________________________________________________________________________________ res4b7_branch2b_relu_rgb (Activ (None, None, None, 2 0 bn4b7_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4a_branch2c_d (Conv2D) (None, None, None, 1 262144 res4a_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4a_branch1_d (Conv2D) (None, None, None, 1 524288 res3b3_relu_d[0][0] __________________________________________________________________________________________________ res4b7_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b7_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4a_d (Add) (None, None, None, 1 0 res4a_branch2c_d[0][0] res4a_branch1_d[0][0] __________________________________________________________________________________________________ bn4b7_branch2c_rgb (BatchNormal (None, None, None, 1 4096 res4b7_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4a_relu_d (Activation) (None, None, None, 1 0 res4a_d[0][0] __________________________________________________________________________________________________ res4b7_rgb (Add) (None, None, None, 1 0 bn4b7_branch2c_rgb[0][0] res4b6_relu_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2a_d (Conv2D) (None, None, None, 2 262144 res4a_relu_d[0][0] __________________________________________________________________________________________________ res4b7_relu_rgb (Activation) (None, None, None, 1 0 res4b7_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2a_relu_d (Activat (None, None, None, 2 0 res4b1_branch2a_d[0][0] 
__________________________________________________________________________________________________ res4b8_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b7_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b1_branch2b_d (ZeroPadd (None, None, None, 2 0 res4b1_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b8_branch2a_rgb (BatchNormal (None, None, None, 2 1024 res4b8_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b1_branch2b_d[0][0] __________________________________________________________________________________________________ res4b8_branch2a_relu_rgb (Activ (None, None, None, 2 0 bn4b8_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2b_relu_d (Activat (None, None, None, 2 0 res4b1_branch2b_d[0][0] __________________________________________________________________________________________________ padding4b8_branch2b_rgb (ZeroPa (None, None, None, 2 0 res4b8_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b1_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b1_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b8_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b8_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b1_d (Add) (None, None, None, 1 0 res4b1_branch2c_d[0][0] res4a_relu_d[0][0] __________________________________________________________________________________________________ bn4b8_branch2b_rgb (BatchNormal (None, None, None, 2 1024 res4b8_branch2b_rgb[0][0] 
__________________________________________________________________________________________________
[... res4 stage continues: interleaved bottleneck blocks for the two streams, res4b2_d through
res4b16_d (depth) and res4b9_rgb through res4b18_rgb (RGB). Every block repeats the same layer
pattern: res4bN_branch2a (Conv2D, 262144 params) -> bn -> relu -> zero-padding -> res4bN_branch2b
(Conv2D, 589824 params) -> bn -> relu -> res4bN_branch2c (Conv2D, 262144 params) -> bn ->
res4bN (Add, residual from the previous block's relu) -> res4bN_relu (Activation).
BatchNormalization layers carry 1024 params on the bottleneck tensors and 4096 on the block
outputs; the "Output Shape" column is truncated by the summary's fixed column width. ...]
__________________________________________________________________________________________________
res4b16_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b19_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b18_relu_rgb[0][0] __________________________________________________________________________________________________ res4b16_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b16_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b19_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b19_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b16_branch2b_relu_d (Activa (None, None, None, 2 0 res4b16_branch2b_d[0][0] __________________________________________________________________________________________________ res4b19_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b19_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b16_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b16_branch2b_relu_d[0][0] __________________________________________________________________________________________________ padding4b19_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b19_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b16_d (Add) (None, None, None, 1 0 res4b16_branch2c_d[0][0] res4b15_relu_d[0][0] __________________________________________________________________________________________________ res4b19_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b19_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b16_relu_d (Activation) (None, None, None, 1 0 res4b16_d[0][0] __________________________________________________________________________________________________ bn4b19_branch2b_rgb (BatchNorma (None, None, None, 2 
1024 res4b19_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b16_relu_d[0][0] __________________________________________________________________________________________________ res4b19_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b19_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2a_relu_d (Activa (None, None, None, 2 0 res4b17_branch2a_d[0][0] __________________________________________________________________________________________________ res4b19_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b19_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b17_branch2b_d (ZeroPad (None, None, None, 2 0 res4b17_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b19_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b19_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b17_branch2b_d[0][0] __________________________________________________________________________________________________ res4b19_rgb (Add) (None, None, None, 1 0 bn4b19_branch2c_rgb[0][0] res4b18_relu_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2b_relu_d (Activa (None, None, None, 2 0 res4b17_branch2b_d[0][0] __________________________________________________________________________________________________ res4b19_relu_rgb (Activation) (None, None, None, 1 0 res4b19_rgb[0][0] __________________________________________________________________________________________________ res4b17_branch2c_d (Conv2D) (None, None, None, 1 
262144 res4b17_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b20_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b19_relu_rgb[0][0] __________________________________________________________________________________________________ res4b17_d (Add) (None, None, None, 1 0 res4b17_branch2c_d[0][0] res4b16_relu_d[0][0] __________________________________________________________________________________________________ bn4b20_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b20_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b17_relu_d (Activation) (None, None, None, 1 0 res4b17_d[0][0] __________________________________________________________________________________________________ res4b20_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b20_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b17_relu_d[0][0] __________________________________________________________________________________________________ padding4b20_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b20_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2a_relu_d (Activa (None, None, None, 2 0 res4b18_branch2a_d[0][0] __________________________________________________________________________________________________ res4b20_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b20_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding4b18_branch2b_d (ZeroPad (None, None, None, 2 0 res4b18_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn4b20_branch2b_rgb (BatchNorma (None, None, None, 2 
1024 res4b20_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b18_branch2b_d[0][0] __________________________________________________________________________________________________ res4b20_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b20_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2b_relu_d (Activa (None, None, None, 2 0 res4b18_branch2b_d[0][0] __________________________________________________________________________________________________ res4b20_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b20_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b18_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b18_branch2b_relu_d[0][0] __________________________________________________________________________________________________ bn4b20_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b20_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b18_d (Add) (None, None, None, 1 0 res4b18_branch2c_d[0][0] res4b17_relu_d[0][0] __________________________________________________________________________________________________ res4b20_rgb (Add) (None, None, None, 1 0 bn4b20_branch2c_rgb[0][0] res4b19_relu_rgb[0][0] __________________________________________________________________________________________________ res4b18_relu_d (Activation) (None, None, None, 1 0 res4b18_d[0][0] __________________________________________________________________________________________________ res4b20_relu_rgb (Activation) (None, None, None, 1 0 res4b20_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2a_d (Conv2D) (None, None, None, 2 262144 
res4b18_relu_d[0][0] __________________________________________________________________________________________________ res4b21_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b20_relu_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2a_relu_d (Activa (None, None, None, 2 0 res4b19_branch2a_d[0][0] __________________________________________________________________________________________________ bn4b21_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b21_branch2a_rgb[0][0] __________________________________________________________________________________________________ padding4b19_branch2b_d (ZeroPad (None, None, None, 2 0 res4b19_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b21_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b21_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b19_branch2b_d[0][0] __________________________________________________________________________________________________ padding4b21_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b21_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2b_relu_d (Activa (None, None, None, 2 0 res4b19_branch2b_d[0][0] __________________________________________________________________________________________________ res4b21_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b21_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b19_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b19_branch2b_relu_d[0][0] __________________________________________________________________________________________________ bn4b21_branch2b_rgb (BatchNorma (None, None, None, 
2 1024 res4b21_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b19_d (Add) (None, None, None, 1 0 res4b19_branch2c_d[0][0] res4b18_relu_d[0][0] __________________________________________________________________________________________________ res4b21_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b21_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b19_relu_d (Activation) (None, None, None, 1 0 res4b19_d[0][0] __________________________________________________________________________________________________ res4b21_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b21_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b20_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b19_relu_d[0][0] __________________________________________________________________________________________________ bn4b21_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b21_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b20_branch2a_relu_d (Activa (None, None, None, 2 0 res4b20_branch2a_d[0][0] __________________________________________________________________________________________________ res4b21_rgb (Add) (None, None, None, 1 0 bn4b21_branch2c_rgb[0][0] res4b20_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b20_branch2b_d (ZeroPad (None, None, None, 2 0 res4b20_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b21_relu_rgb (Activation) (None, None, None, 1 0 res4b21_rgb[0][0] __________________________________________________________________________________________________ res4b20_branch2b_d (Conv2D) (None, None, None, 2 589824 
padding4b20_branch2b_d[0][0] __________________________________________________________________________________________________ res4b22_branch2a_rgb (Conv2D) (None, None, None, 2 262144 res4b21_relu_rgb[0][0] __________________________________________________________________________________________________ res4b20_branch2b_relu_d (Activa (None, None, None, 2 0 res4b20_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b22_branch2a_rgb (BatchNorma (None, None, None, 2 1024 res4b22_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b20_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b20_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b22_branch2a_relu_rgb (Acti (None, None, None, 2 0 bn4b22_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b20_d (Add) (None, None, None, 1 0 res4b20_branch2c_d[0][0] res4b19_relu_d[0][0] __________________________________________________________________________________________________ padding4b22_branch2b_rgb (ZeroP (None, None, None, 2 0 res4b22_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res4b20_relu_d (Activation) (None, None, None, 1 0 res4b20_d[0][0] __________________________________________________________________________________________________ res4b22_branch2b_rgb (Conv2D) (None, None, None, 2 589824 padding4b22_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b21_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b20_relu_d[0][0] __________________________________________________________________________________________________ bn4b22_branch2b_rgb (BatchNorma (None, None, None, 2 1024 
res4b22_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b21_branch2a_relu_d (Activa (None, None, None, 2 0 res4b21_branch2a_d[0][0] __________________________________________________________________________________________________ res4b22_branch2b_relu_rgb (Acti (None, None, None, 2 0 bn4b22_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding4b21_branch2b_d (ZeroPad (None, None, None, 2 0 res4b21_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res4b22_branch2c_rgb (Conv2D) (None, None, None, 1 262144 res4b22_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res4b21_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b21_branch2b_d[0][0] __________________________________________________________________________________________________ bn4b22_branch2c_rgb (BatchNorma (None, None, None, 1 4096 res4b22_branch2c_rgb[0][0] __________________________________________________________________________________________________ res4b21_branch2b_relu_d (Activa (None, None, None, 2 0 res4b21_branch2b_d[0][0] __________________________________________________________________________________________________ res4b22_rgb (Add) (None, None, None, 1 0 bn4b22_branch2c_rgb[0][0] res4b21_relu_rgb[0][0] __________________________________________________________________________________________________ res4b21_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b21_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res4b22_relu_rgb (Activation) (None, None, None, 1 0 res4b22_rgb[0][0] __________________________________________________________________________________________________ res4b21_d (Add) (None, None, None, 1 0 
res4b21_branch2c_d[0][0] res4b20_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch2a_rgb (Conv2D) (None, None, None, 5 524288 res4b22_relu_rgb[0][0] __________________________________________________________________________________________________ res4b21_relu_d (Activation) (None, None, None, 1 0 res4b21_d[0][0] __________________________________________________________________________________________________ bn5a_branch2a_rgb (BatchNormali (None, None, None, 5 2048 res5a_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2a_d (Conv2D) (None, None, None, 2 262144 res4b21_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch2a_relu_rgb (Activa (None, None, None, 5 0 bn5a_branch2a_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2a_relu_d (Activa (None, None, None, 2 0 res4b22_branch2a_d[0][0] __________________________________________________________________________________________________ padding5a_branch2b_rgb (ZeroPad (None, None, None, 5 0 res5a_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ padding4b22_branch2b_d (ZeroPad (None, None, None, 2 0 res4b22_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch2b_rgb (Conv2D) (None, None, None, 5 2359296 padding5a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2b_d (Conv2D) (None, None, None, 2 589824 padding4b22_branch2b_d[0][0] __________________________________________________________________________________________________ bn5a_branch2b_rgb (BatchNormali (None, None, None, 5 2048 
res5a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2b_relu_d (Activa (None, None, None, 2 0 res4b22_branch2b_d[0][0] __________________________________________________________________________________________________ res5a_branch2b_relu_rgb (Activa (None, None, None, 5 0 bn5a_branch2b_rgb[0][0] __________________________________________________________________________________________________ res4b22_branch2c_d (Conv2D) (None, None, None, 1 262144 res4b22_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch2c_rgb (Conv2D) (None, None, None, 2 1048576 res5a_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res5a_branch1_rgb (Conv2D) (None, None, None, 2 2097152 res4b22_relu_rgb[0][0] __________________________________________________________________________________________________ res4b22_d (Add) (None, None, None, 1 0 res4b22_branch2c_d[0][0] res4b21_relu_d[0][0] __________________________________________________________________________________________________ bn5a_branch2c_rgb (BatchNormali (None, None, None, 2 8192 res5a_branch2c_rgb[0][0] __________________________________________________________________________________________________ bn5a_branch1_rgb (BatchNormaliz (None, None, None, 2 8192 res5a_branch1_rgb[0][0] __________________________________________________________________________________________________ res4b22_relu_d (Activation) (None, None, None, 1 0 res4b22_d[0][0] __________________________________________________________________________________________________ res5a_rgb (Add) (None, None, None, 2 0 bn5a_branch2c_rgb[0][0] bn5a_branch1_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2a_d (Conv2D) (None, None, None, 5 524288 
res4b22_relu_d[0][0] __________________________________________________________________________________________________ res5a_relu_rgb (Activation) (None, None, None, 2 0 res5a_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2a_relu_d (Activati (None, None, None, 5 0 res5a_branch2a_d[0][0] __________________________________________________________________________________________________ res5b_branch2a_rgb (Conv2D) (None, None, None, 5 1048576 res5a_relu_rgb[0][0] __________________________________________________________________________________________________ padding5a_branch2b_d (ZeroPaddi (None, None, None, 5 0 res5a_branch2a_relu_d[0][0] __________________________________________________________________________________________________ bn5b_branch2a_rgb (BatchNormali (None, None, None, 5 2048 res5b_branch2a_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2b_d (Conv2D) (None, None, None, 5 2359296 padding5a_branch2b_d[0][0] __________________________________________________________________________________________________ res5b_branch2a_relu_rgb (Activa (None, None, None, 5 0 bn5b_branch2a_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2b_relu_d (Activati (None, None, None, 5 0 res5a_branch2b_d[0][0] __________________________________________________________________________________________________ padding5b_branch2b_rgb (ZeroPad (None, None, None, 5 0 res5b_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res5a_branch2c_d (Conv2D) (None, None, None, 2 1048576 res5a_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res5a_branch1_d (Conv2D) (None, None, None, 2 2097152 res4b22_relu_d[0][0] 
__________________________________________________________________________________________________ res5b_branch2b_rgb (Conv2D) (None, None, None, 5 2359296 padding5b_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5a_d (Add) (None, None, None, 2 0 res5a_branch2c_d[0][0] res5a_branch1_d[0][0] __________________________________________________________________________________________________ bn5b_branch2b_rgb (BatchNormali (None, None, None, 5 2048 res5b_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5a_relu_d (Activation) (None, None, None, 2 0 res5a_d[0][0] __________________________________________________________________________________________________ res5b_branch2b_relu_rgb (Activa (None, None, None, 5 0 bn5b_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2a_d (Conv2D) (None, None, None, 5 1048576 res5a_relu_d[0][0] __________________________________________________________________________________________________ res5b_branch2c_rgb (Conv2D) (None, None, None, 2 1048576 res5b_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2a_relu_d (Activati (None, None, None, 5 0 res5b_branch2a_d[0][0] __________________________________________________________________________________________________ bn5b_branch2c_rgb (BatchNormali (None, None, None, 2 8192 res5b_branch2c_rgb[0][0] __________________________________________________________________________________________________ padding5b_branch2b_d (ZeroPaddi (None, None, None, 5 0 res5b_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res5b_rgb (Add) (None, None, None, 2 0 bn5b_branch2c_rgb[0][0] res5a_relu_rgb[0][0] 
__________________________________________________________________________________________________ res5b_branch2b_d (Conv2D) (None, None, None, 5 2359296 padding5b_branch2b_d[0][0] __________________________________________________________________________________________________ res5b_relu_rgb (Activation) (None, None, None, 2 0 res5b_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2b_relu_d (Activati (None, None, None, 5 0 res5b_branch2b_d[0][0] __________________________________________________________________________________________________ res5c_branch2a_rgb (Conv2D) (None, None, None, 5 1048576 res5b_relu_rgb[0][0] __________________________________________________________________________________________________ res5b_branch2c_d (Conv2D) (None, None, None, 2 1048576 res5b_branch2b_relu_d[0][0] __________________________________________________________________________________________________ bn5c_branch2a_rgb (BatchNormali (None, None, None, 5 2048 res5c_branch2a_rgb[0][0] __________________________________________________________________________________________________ res5b_d (Add) (None, None, None, 2 0 res5b_branch2c_d[0][0] res5a_relu_d[0][0] __________________________________________________________________________________________________ res5c_branch2a_relu_rgb (Activa (None, None, None, 5 0 bn5c_branch2a_rgb[0][0] __________________________________________________________________________________________________ res5b_relu_d (Activation) (None, None, None, 2 0 res5b_d[0][0] __________________________________________________________________________________________________ padding5c_branch2b_rgb (ZeroPad (None, None, None, 5 0 res5c_branch2a_relu_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2a_d (Conv2D) (None, None, None, 5 1048576 res5b_relu_d[0][0] 
__________________________________________________________________________________________________ res5c_branch2b_rgb (Conv2D) (None, None, None, 5 2359296 padding5c_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2a_relu_d (Activati (None, None, None, 5 0 res5c_branch2a_d[0][0] __________________________________________________________________________________________________ bn5c_branch2b_rgb (BatchNormali (None, None, None, 5 2048 res5c_branch2b_rgb[0][0] __________________________________________________________________________________________________ padding5c_branch2b_d (ZeroPaddi (None, None, None, 5 0 res5c_branch2a_relu_d[0][0] __________________________________________________________________________________________________ res5c_branch2b_relu_rgb (Activa (None, None, None, 5 0 bn5c_branch2b_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2b_d (Conv2D) (None, None, None, 5 2359296 padding5c_branch2b_d[0][0] __________________________________________________________________________________________________ res5c_branch2c_rgb (Conv2D) (None, None, None, 2 1048576 res5c_branch2b_relu_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2b_relu_d (Activati (None, None, None, 5 0 res5c_branch2b_d[0][0] __________________________________________________________________________________________________ bn5c_branch2c_rgb (BatchNormali (None, None, None, 2 8192 res5c_branch2c_rgb[0][0] __________________________________________________________________________________________________ res5c_branch2c_d (Conv2D) (None, None, None, 2 1048576 res5c_branch2b_relu_d[0][0] __________________________________________________________________________________________________ res5c_rgb (Add) (None, None, None, 2 0 bn5c_branch2c_rgb[0][0] res5b_relu_rgb[0][0] 
__________________________________________________________________________________________________ res5c_d (Add) (None, None, None, 2 0 res5c_branch2c_d[0][0] res5b_relu_d[0][0] __________________________________________________________________________________________________ res5c_relu_rgb (Activation) (None, None, None, 2 0 res5c_rgb[0][0] __________________________________________________________________________________________________ res5c_relu_d (Activation) (None, None, None, 2 0 res5c_d[0][0] __________________________________________________________________________________________________ concatenate_33 (Concatenate) (None, None, None, 4 0 res5c_relu_rgb[0][0] res5c_relu_d[0][0] __________________________________________________________________________________________________ C5_reduced (Conv2D) (None, None, None, 2 1048832 concatenate_33[0][0] __________________________________________________________________________________________________ concatenate_30 (Concatenate) (None, None, None, 2 0 res4b22_relu_rgb[0][0] res4b22_relu_d[0][0] __________________________________________________________________________________________________ P5_upsampled (UpsampleLike) (None, None, None, 2 0 C5_reduced[0][0] concatenate_30[0][0] __________________________________________________________________________________________________ C4_reduced (Conv2D) (None, None, None, 2 524544 concatenate_30[0][0] __________________________________________________________________________________________________ P4_merged (Add) (None, None, None, 2 0 P5_upsampled[0][0] C4_reduced[0][0] __________________________________________________________________________________________________ concatenate_7 (Concatenate) (None, None, None, 1 0 res3b3_relu_rgb[0][0] res3b3_relu_d[0][0] __________________________________________________________________________________________________ P4_upsampled (UpsampleLike) (None, None, None, 2 0 P4_merged[0][0] concatenate_7[0][0] 
__________________________________________________________________________________________________
C3_reduced (Conv2D)             (None, None, None, 2 262400      concatenate_7[0][0]
__________________________________________________________________________________________________
P6 (Conv2D)                     (None, None, None, 2 9437440     concatenate_33[0][0]
__________________________________________________________________________________________________
P3_merged (Add)                 (None, None, None, 2 0           P4_upsampled[0][0]
                                                                 C3_reduced[0][0]
__________________________________________________________________________________________________
C6_relu (Activation)            (None, None, None, 2 0           P6[0][0]
__________________________________________________________________________________________________
P3 (Conv2D)                     (None, None, None, 2 590080      P3_merged[0][0]
__________________________________________________________________________________________________
P4 (Conv2D)                     (None, None, None, 2 590080      P4_merged[0][0]
__________________________________________________________________________________________________
P5 (Conv2D)                     (None, None, None, 2 590080      C5_reduced[0][0]
__________________________________________________________________________________________________
P7 (Conv2D)                     (None, None, None, 2 590080      C6_relu[0][0]
__________________________________________________________________________________________________
regression_submodel (Model)     (None, None, 4)      2443300     P3[0][0]
                                                                 P4[0][0]
                                                                 P5[0][0]
                                                                 P6[0][0]
                                                                 P7[0][0]
__________________________________________________________________________________________________
classification_submodel (Model) (None, None, 1)      2381065     P3[0][0]
                                                                 P4[0][0]
                                                                 P5[0][0]
                                                                 P6[0][0]
                                                                 P7[0][0]
__________________________________________________________________________________________________
regression (Concatenate)        (None, None, 4)      0           regression_submodel[1][0]
                                                                 regression_submodel[2][0]
                                                                 regression_submodel[3][0]
                                                                 regression_submodel[4][0]
                                                                 regression_submodel[5][0]
__________________________________________________________________________________________________
classification (Concatenate)    (None, None, 1)      0           classification_submodel[1][0]
                                                                 classification_submodel[2][0]
                                                                 classification_submodel[3][0]
                                                                 classification_submodel[4][0]
                                                                 classification_submodel[5][0]
==================================================================================================
Total params: 103,451,949
Trainable params: 103,241,261
Non-trainable params: 210,688
__________________________________________________________________________________________________
None
Epoch 1/150
  1/500 [..............................] - ETA: 1:37:02 - loss: 0.4415 - regression_loss: 0.4125 - classification_loss: 0.0290
[... progress-bar frames 2/500 through 282/500 elided; after an early spike to 0.6577 at step 3, the running loss settles around 0.48 ...]
283/500 [===============>..............] - ETA: 1:58 - loss: 0.4828 - regression_loss: 0.4405 - classification_loss: 0.0424
284/500 [================>.............]
- ETA: 1:57 - loss: 0.4832 - regression_loss: 0.4408 - classification_loss: 0.0424 285/500 [================>.............] - ETA: 1:57 - loss: 0.4820 - regression_loss: 0.4396 - classification_loss: 0.0423 286/500 [================>.............] - ETA: 1:56 - loss: 0.4818 - regression_loss: 0.4394 - classification_loss: 0.0423 287/500 [================>.............] - ETA: 1:55 - loss: 0.4813 - regression_loss: 0.4391 - classification_loss: 0.0423 288/500 [================>.............] - ETA: 1:55 - loss: 0.4806 - regression_loss: 0.4384 - classification_loss: 0.0422 289/500 [================>.............] - ETA: 1:54 - loss: 0.4799 - regression_loss: 0.4378 - classification_loss: 0.0421 290/500 [================>.............] - ETA: 1:54 - loss: 0.4801 - regression_loss: 0.4380 - classification_loss: 0.0421 291/500 [================>.............] - ETA: 1:53 - loss: 0.4802 - regression_loss: 0.4381 - classification_loss: 0.0421 292/500 [================>.............] - ETA: 1:52 - loss: 0.4803 - regression_loss: 0.4382 - classification_loss: 0.0421 293/500 [================>.............] - ETA: 1:52 - loss: 0.4799 - regression_loss: 0.4378 - classification_loss: 0.0421 294/500 [================>.............] - ETA: 1:51 - loss: 0.4794 - regression_loss: 0.4374 - classification_loss: 0.0420 295/500 [================>.............] - ETA: 1:51 - loss: 0.4800 - regression_loss: 0.4380 - classification_loss: 0.0420 296/500 [================>.............] - ETA: 1:50 - loss: 0.4789 - regression_loss: 0.4370 - classification_loss: 0.0419 297/500 [================>.............] - ETA: 1:50 - loss: 0.4784 - regression_loss: 0.4365 - classification_loss: 0.0418 298/500 [================>.............] - ETA: 1:49 - loss: 0.4782 - regression_loss: 0.4364 - classification_loss: 0.0418 299/500 [================>.............] - ETA: 1:49 - loss: 0.4780 - regression_loss: 0.4363 - classification_loss: 0.0418 300/500 [=================>............] 
- ETA: 1:48 - loss: 0.4777 - regression_loss: 0.4360 - classification_loss: 0.0417 301/500 [=================>............] - ETA: 1:47 - loss: 0.4771 - regression_loss: 0.4354 - classification_loss: 0.0417 302/500 [=================>............] - ETA: 1:47 - loss: 0.4769 - regression_loss: 0.4352 - classification_loss: 0.0417 303/500 [=================>............] - ETA: 1:46 - loss: 0.4772 - regression_loss: 0.4354 - classification_loss: 0.0418 304/500 [=================>............] - ETA: 1:46 - loss: 0.4762 - regression_loss: 0.4345 - classification_loss: 0.0417 305/500 [=================>............] - ETA: 1:45 - loss: 0.4760 - regression_loss: 0.4344 - classification_loss: 0.0416 306/500 [=================>............] - ETA: 1:45 - loss: 0.4762 - regression_loss: 0.4347 - classification_loss: 0.0416 307/500 [=================>............] - ETA: 1:44 - loss: 0.4759 - regression_loss: 0.4344 - classification_loss: 0.0415 308/500 [=================>............] - ETA: 1:43 - loss: 0.4767 - regression_loss: 0.4351 - classification_loss: 0.0416 309/500 [=================>............] - ETA: 1:43 - loss: 0.4765 - regression_loss: 0.4349 - classification_loss: 0.0416 310/500 [=================>............] - ETA: 1:42 - loss: 0.4768 - regression_loss: 0.4352 - classification_loss: 0.0416 311/500 [=================>............] - ETA: 1:42 - loss: 0.4774 - regression_loss: 0.4358 - classification_loss: 0.0416 312/500 [=================>............] - ETA: 1:41 - loss: 0.4785 - regression_loss: 0.4367 - classification_loss: 0.0418 313/500 [=================>............] - ETA: 1:41 - loss: 0.4785 - regression_loss: 0.4367 - classification_loss: 0.0418 314/500 [=================>............] - ETA: 1:40 - loss: 0.4777 - regression_loss: 0.4360 - classification_loss: 0.0417 315/500 [=================>............] - ETA: 1:39 - loss: 0.4777 - regression_loss: 0.4360 - classification_loss: 0.0417 316/500 [=================>............] 
- ETA: 1:39 - loss: 0.4772 - regression_loss: 0.4356 - classification_loss: 0.0416 317/500 [==================>...........] - ETA: 1:38 - loss: 0.4778 - regression_loss: 0.4361 - classification_loss: 0.0417 318/500 [==================>...........] - ETA: 1:38 - loss: 0.4781 - regression_loss: 0.4364 - classification_loss: 0.0417 319/500 [==================>...........] - ETA: 1:37 - loss: 0.4783 - regression_loss: 0.4366 - classification_loss: 0.0417 320/500 [==================>...........] - ETA: 1:37 - loss: 0.4784 - regression_loss: 0.4367 - classification_loss: 0.0418 321/500 [==================>...........] - ETA: 1:36 - loss: 0.4781 - regression_loss: 0.4364 - classification_loss: 0.0417 322/500 [==================>...........] - ETA: 1:36 - loss: 0.4779 - regression_loss: 0.4362 - classification_loss: 0.0417 323/500 [==================>...........] - ETA: 1:35 - loss: 0.4780 - regression_loss: 0.4362 - classification_loss: 0.0417 324/500 [==================>...........] - ETA: 1:34 - loss: 0.4785 - regression_loss: 0.4368 - classification_loss: 0.0417 325/500 [==================>...........] - ETA: 1:34 - loss: 0.4783 - regression_loss: 0.4366 - classification_loss: 0.0417 326/500 [==================>...........] - ETA: 1:33 - loss: 0.4788 - regression_loss: 0.4371 - classification_loss: 0.0416 327/500 [==================>...........] - ETA: 1:33 - loss: 0.4785 - regression_loss: 0.4369 - classification_loss: 0.0416 328/500 [==================>...........] - ETA: 1:32 - loss: 0.4792 - regression_loss: 0.4375 - classification_loss: 0.0417 329/500 [==================>...........] - ETA: 1:32 - loss: 0.4785 - regression_loss: 0.4369 - classification_loss: 0.0417 330/500 [==================>...........] - ETA: 1:31 - loss: 0.4788 - regression_loss: 0.4371 - classification_loss: 0.0417 331/500 [==================>...........] - ETA: 1:30 - loss: 0.4783 - regression_loss: 0.4367 - classification_loss: 0.0416 332/500 [==================>...........] 
- ETA: 1:30 - loss: 0.4787 - regression_loss: 0.4371 - classification_loss: 0.0416 333/500 [==================>...........] - ETA: 1:29 - loss: 0.4794 - regression_loss: 0.4378 - classification_loss: 0.0416 334/500 [===================>..........] - ETA: 1:29 - loss: 0.4804 - regression_loss: 0.4388 - classification_loss: 0.0417 335/500 [===================>..........] - ETA: 1:28 - loss: 0.4798 - regression_loss: 0.4382 - classification_loss: 0.0416 336/500 [===================>..........] - ETA: 1:28 - loss: 0.4801 - regression_loss: 0.4385 - classification_loss: 0.0416 337/500 [===================>..........] - ETA: 1:27 - loss: 0.4798 - regression_loss: 0.4382 - classification_loss: 0.0416 338/500 [===================>..........] - ETA: 1:27 - loss: 0.4792 - regression_loss: 0.4377 - classification_loss: 0.0415 339/500 [===================>..........] - ETA: 1:26 - loss: 0.4793 - regression_loss: 0.4377 - classification_loss: 0.0415 340/500 [===================>..........] - ETA: 1:26 - loss: 0.4794 - regression_loss: 0.4378 - classification_loss: 0.0415 341/500 [===================>..........] - ETA: 1:25 - loss: 0.4797 - regression_loss: 0.4382 - classification_loss: 0.0416 342/500 [===================>..........] - ETA: 1:24 - loss: 0.4800 - regression_loss: 0.4384 - classification_loss: 0.0416 343/500 [===================>..........] - ETA: 1:24 - loss: 0.4805 - regression_loss: 0.4389 - classification_loss: 0.0416 344/500 [===================>..........] - ETA: 1:23 - loss: 0.4810 - regression_loss: 0.4393 - classification_loss: 0.0416 345/500 [===================>..........] - ETA: 1:23 - loss: 0.4809 - regression_loss: 0.4393 - classification_loss: 0.0416 346/500 [===================>..........] - ETA: 1:22 - loss: 0.4811 - regression_loss: 0.4395 - classification_loss: 0.0416 347/500 [===================>..........] - ETA: 1:22 - loss: 0.4804 - regression_loss: 0.4389 - classification_loss: 0.0415 348/500 [===================>..........] 
- ETA: 1:21 - loss: 0.4805 - regression_loss: 0.4390 - classification_loss: 0.0415 349/500 [===================>..........] - ETA: 1:21 - loss: 0.4807 - regression_loss: 0.4391 - classification_loss: 0.0416 350/500 [====================>.........] - ETA: 1:20 - loss: 0.4807 - regression_loss: 0.4391 - classification_loss: 0.0416 351/500 [====================>.........] - ETA: 1:19 - loss: 0.4800 - regression_loss: 0.4384 - classification_loss: 0.0415 352/500 [====================>.........] - ETA: 1:19 - loss: 0.4799 - regression_loss: 0.4384 - classification_loss: 0.0415 353/500 [====================>.........] - ETA: 1:18 - loss: 0.4799 - regression_loss: 0.4384 - classification_loss: 0.0415 354/500 [====================>.........] - ETA: 1:18 - loss: 0.4806 - regression_loss: 0.4390 - classification_loss: 0.0416 355/500 [====================>.........] - ETA: 1:17 - loss: 0.4801 - regression_loss: 0.4385 - classification_loss: 0.0416 356/500 [====================>.........] - ETA: 1:17 - loss: 0.4802 - regression_loss: 0.4386 - classification_loss: 0.0416 357/500 [====================>.........] - ETA: 1:16 - loss: 0.4799 - regression_loss: 0.4383 - classification_loss: 0.0416 358/500 [====================>.........] - ETA: 1:16 - loss: 0.4800 - regression_loss: 0.4384 - classification_loss: 0.0416 359/500 [====================>.........] - ETA: 1:15 - loss: 0.4791 - regression_loss: 0.4376 - classification_loss: 0.0416 360/500 [====================>.........] - ETA: 1:14 - loss: 0.4792 - regression_loss: 0.4376 - classification_loss: 0.0416 361/500 [====================>.........] - ETA: 1:14 - loss: 0.4799 - regression_loss: 0.4384 - classification_loss: 0.0415 362/500 [====================>.........] - ETA: 1:13 - loss: 0.4802 - regression_loss: 0.4386 - classification_loss: 0.0416 363/500 [====================>.........] - ETA: 1:13 - loss: 0.4803 - regression_loss: 0.4387 - classification_loss: 0.0416 364/500 [====================>.........] 
- ETA: 1:12 - loss: 0.4802 - regression_loss: 0.4386 - classification_loss: 0.0416 365/500 [====================>.........] - ETA: 1:12 - loss: 0.4800 - regression_loss: 0.4385 - classification_loss: 0.0415 366/500 [====================>.........] - ETA: 1:11 - loss: 0.4799 - regression_loss: 0.4384 - classification_loss: 0.0415 367/500 [=====================>........] - ETA: 1:11 - loss: 0.4797 - regression_loss: 0.4381 - classification_loss: 0.0416 368/500 [=====================>........] - ETA: 1:10 - loss: 0.4795 - regression_loss: 0.4380 - classification_loss: 0.0416 369/500 [=====================>........] - ETA: 1:10 - loss: 0.4791 - regression_loss: 0.4376 - classification_loss: 0.0415 370/500 [=====================>........] - ETA: 1:09 - loss: 0.4786 - regression_loss: 0.4371 - classification_loss: 0.0415 371/500 [=====================>........] - ETA: 1:08 - loss: 0.4786 - regression_loss: 0.4372 - classification_loss: 0.0415 372/500 [=====================>........] - ETA: 1:08 - loss: 0.4784 - regression_loss: 0.4370 - classification_loss: 0.0414 373/500 [=====================>........] - ETA: 1:07 - loss: 0.4779 - regression_loss: 0.4366 - classification_loss: 0.0414 374/500 [=====================>........] - ETA: 1:07 - loss: 0.4780 - regression_loss: 0.4366 - classification_loss: 0.0413 375/500 [=====================>........] - ETA: 1:06 - loss: 0.4780 - regression_loss: 0.4366 - classification_loss: 0.0413 376/500 [=====================>........] - ETA: 1:06 - loss: 0.4779 - regression_loss: 0.4366 - classification_loss: 0.0413 377/500 [=====================>........] - ETA: 1:05 - loss: 0.4776 - regression_loss: 0.4363 - classification_loss: 0.0413 378/500 [=====================>........] - ETA: 1:05 - loss: 0.4773 - regression_loss: 0.4361 - classification_loss: 0.0412 379/500 [=====================>........] - ETA: 1:04 - loss: 0.4773 - regression_loss: 0.4361 - classification_loss: 0.0412 380/500 [=====================>........] 
- ETA: 1:04 - loss: 0.4773 - regression_loss: 0.4360 - classification_loss: 0.0412 381/500 [=====================>........] - ETA: 1:03 - loss: 0.4775 - regression_loss: 0.4363 - classification_loss: 0.0412 382/500 [=====================>........] - ETA: 1:02 - loss: 0.4766 - regression_loss: 0.4355 - classification_loss: 0.0411 383/500 [=====================>........] - ETA: 1:02 - loss: 0.4761 - regression_loss: 0.4351 - classification_loss: 0.0411 384/500 [======================>.......] - ETA: 1:01 - loss: 0.4758 - regression_loss: 0.4348 - classification_loss: 0.0410 385/500 [======================>.......] - ETA: 1:01 - loss: 0.4753 - regression_loss: 0.4343 - classification_loss: 0.0410 386/500 [======================>.......] - ETA: 1:00 - loss: 0.4753 - regression_loss: 0.4343 - classification_loss: 0.0410 387/500 [======================>.......] - ETA: 1:00 - loss: 0.4749 - regression_loss: 0.4340 - classification_loss: 0.0409 388/500 [======================>.......] - ETA: 59s - loss: 0.4750 - regression_loss: 0.4340 - classification_loss: 0.0410  389/500 [======================>.......] - ETA: 59s - loss: 0.4750 - regression_loss: 0.4339 - classification_loss: 0.0410 390/500 [======================>.......] - ETA: 58s - loss: 0.4770 - regression_loss: 0.4358 - classification_loss: 0.0412 391/500 [======================>.......] - ETA: 58s - loss: 0.4779 - regression_loss: 0.4368 - classification_loss: 0.0411 392/500 [======================>.......] - ETA: 57s - loss: 0.4778 - regression_loss: 0.4367 - classification_loss: 0.0411 393/500 [======================>.......] - ETA: 56s - loss: 0.4778 - regression_loss: 0.4367 - classification_loss: 0.0411 394/500 [======================>.......] - ETA: 56s - loss: 0.4780 - regression_loss: 0.4368 - classification_loss: 0.0411 395/500 [======================>.......] - ETA: 55s - loss: 0.4783 - regression_loss: 0.4371 - classification_loss: 0.0412 396/500 [======================>.......] 
- ETA: 55s - loss: 0.4785 - regression_loss: 0.4373 - classification_loss: 0.0412 397/500 [======================>.......] - ETA: 54s - loss: 0.4781 - regression_loss: 0.4369 - classification_loss: 0.0412 398/500 [======================>.......] - ETA: 54s - loss: 0.4775 - regression_loss: 0.4364 - classification_loss: 0.0411 399/500 [======================>.......] - ETA: 53s - loss: 0.4772 - regression_loss: 0.4361 - classification_loss: 0.0411 400/500 [=======================>......] - ETA: 53s - loss: 0.4771 - regression_loss: 0.4360 - classification_loss: 0.0411 401/500 [=======================>......] - ETA: 52s - loss: 0.4778 - regression_loss: 0.4366 - classification_loss: 0.0412 402/500 [=======================>......] - ETA: 52s - loss: 0.4774 - regression_loss: 0.4362 - classification_loss: 0.0411 403/500 [=======================>......] - ETA: 51s - loss: 0.4771 - regression_loss: 0.4360 - classification_loss: 0.0411 404/500 [=======================>......] - ETA: 51s - loss: 0.4766 - regression_loss: 0.4355 - classification_loss: 0.0411 405/500 [=======================>......] - ETA: 50s - loss: 0.4768 - regression_loss: 0.4357 - classification_loss: 0.0411 406/500 [=======================>......] - ETA: 49s - loss: 0.4765 - regression_loss: 0.4354 - classification_loss: 0.0411 407/500 [=======================>......] - ETA: 49s - loss: 0.4763 - regression_loss: 0.4352 - classification_loss: 0.0411 408/500 [=======================>......] - ETA: 48s - loss: 0.4759 - regression_loss: 0.4348 - classification_loss: 0.0411 409/500 [=======================>......] - ETA: 48s - loss: 0.4755 - regression_loss: 0.4344 - classification_loss: 0.0411 410/500 [=======================>......] - ETA: 47s - loss: 0.4748 - regression_loss: 0.4338 - classification_loss: 0.0410 411/500 [=======================>......] - ETA: 47s - loss: 0.4748 - regression_loss: 0.4338 - classification_loss: 0.0410 412/500 [=======================>......] 
- ETA: 46s - loss: 0.4742 - regression_loss: 0.4332 - classification_loss: 0.0410 413/500 [=======================>......] - ETA: 46s - loss: 0.4750 - regression_loss: 0.4339 - classification_loss: 0.0410 414/500 [=======================>......] - ETA: 45s - loss: 0.4761 - regression_loss: 0.4349 - classification_loss: 0.0411 415/500 [=======================>......] - ETA: 45s - loss: 0.4760 - regression_loss: 0.4349 - classification_loss: 0.0411 416/500 [=======================>......] - ETA: 44s - loss: 0.4764 - regression_loss: 0.4353 - classification_loss: 0.0411 417/500 [========================>.....] - ETA: 44s - loss: 0.4767 - regression_loss: 0.4355 - classification_loss: 0.0412 418/500 [========================>.....] - ETA: 43s - loss: 0.4761 - regression_loss: 0.4350 - classification_loss: 0.0411 419/500 [========================>.....] - ETA: 42s - loss: 0.4753 - regression_loss: 0.4343 - classification_loss: 0.0410 420/500 [========================>.....] - ETA: 42s - loss: 0.4757 - regression_loss: 0.4347 - classification_loss: 0.0411 421/500 [========================>.....] - ETA: 41s - loss: 0.4751 - regression_loss: 0.4341 - classification_loss: 0.0410 422/500 [========================>.....] - ETA: 41s - loss: 0.4753 - regression_loss: 0.4343 - classification_loss: 0.0410 423/500 [========================>.....] - ETA: 40s - loss: 0.4756 - regression_loss: 0.4346 - classification_loss: 0.0410 424/500 [========================>.....] - ETA: 40s - loss: 0.4750 - regression_loss: 0.4340 - classification_loss: 0.0409 425/500 [========================>.....] - ETA: 39s - loss: 0.4756 - regression_loss: 0.4346 - classification_loss: 0.0410 426/500 [========================>.....] - ETA: 39s - loss: 0.4752 - regression_loss: 0.4342 - classification_loss: 0.0409 427/500 [========================>.....] - ETA: 38s - loss: 0.4751 - regression_loss: 0.4342 - classification_loss: 0.0409 428/500 [========================>.....] 
- ETA: 38s - loss: 0.4750 - regression_loss: 0.4342 - classification_loss: 0.0409 429/500 [========================>.....] - ETA: 37s - loss: 0.4750 - regression_loss: 0.4341 - classification_loss: 0.0408 430/500 [========================>.....] - ETA: 37s - loss: 0.4746 - regression_loss: 0.4337 - classification_loss: 0.0408 431/500 [========================>.....] - ETA: 36s - loss: 0.4747 - regression_loss: 0.4338 - classification_loss: 0.0408 432/500 [========================>.....] - ETA: 36s - loss: 0.4746 - regression_loss: 0.4338 - classification_loss: 0.0408 433/500 [========================>.....] - ETA: 35s - loss: 0.4753 - regression_loss: 0.4343 - classification_loss: 0.0409 434/500 [=========================>....] - ETA: 34s - loss: 0.4749 - regression_loss: 0.4341 - classification_loss: 0.0409 435/500 [=========================>....] - ETA: 34s - loss: 0.4744 - regression_loss: 0.4336 - classification_loss: 0.0408 436/500 [=========================>....] - ETA: 33s - loss: 0.4739 - regression_loss: 0.4331 - classification_loss: 0.0407 437/500 [=========================>....] - ETA: 33s - loss: 0.4737 - regression_loss: 0.4330 - classification_loss: 0.0407 438/500 [=========================>....] - ETA: 32s - loss: 0.4732 - regression_loss: 0.4325 - classification_loss: 0.0407 439/500 [=========================>....] - ETA: 32s - loss: 0.4736 - regression_loss: 0.4329 - classification_loss: 0.0407 440/500 [=========================>....] - ETA: 31s - loss: 0.4743 - regression_loss: 0.4335 - classification_loss: 0.0408 441/500 [=========================>....] - ETA: 31s - loss: 0.4735 - regression_loss: 0.4328 - classification_loss: 0.0407 442/500 [=========================>....] - ETA: 30s - loss: 0.4731 - regression_loss: 0.4324 - classification_loss: 0.0407 443/500 [=========================>....] - ETA: 30s - loss: 0.4732 - regression_loss: 0.4324 - classification_loss: 0.0408 444/500 [=========================>....] 
- ETA: 29s - loss: 0.4743 - regression_loss: 0.4334 - classification_loss: 0.0409 445/500 [=========================>....] - ETA: 29s - loss: 0.4739 - regression_loss: 0.4329 - classification_loss: 0.0409 446/500 [=========================>....] - ETA: 28s - loss: 0.4740 - regression_loss: 0.4331 - classification_loss: 0.0409 447/500 [=========================>....] - ETA: 28s - loss: 0.4740 - regression_loss: 0.4332 - classification_loss: 0.0408 448/500 [=========================>....] - ETA: 27s - loss: 0.4740 - regression_loss: 0.4331 - classification_loss: 0.0408 449/500 [=========================>....] - ETA: 26s - loss: 0.4747 - regression_loss: 0.4337 - classification_loss: 0.0410 450/500 [==========================>...] - ETA: 26s - loss: 0.4751 - regression_loss: 0.4340 - classification_loss: 0.0410 451/500 [==========================>...] - ETA: 25s - loss: 0.4745 - regression_loss: 0.4336 - classification_loss: 0.0410 452/500 [==========================>...] - ETA: 25s - loss: 0.4742 - regression_loss: 0.4333 - classification_loss: 0.0409 453/500 [==========================>...] - ETA: 24s - loss: 0.4738 - regression_loss: 0.4329 - classification_loss: 0.0409 454/500 [==========================>...] - ETA: 24s - loss: 0.4739 - regression_loss: 0.4330 - classification_loss: 0.0409 455/500 [==========================>...] - ETA: 23s - loss: 0.4744 - regression_loss: 0.4334 - classification_loss: 0.0410 456/500 [==========================>...] - ETA: 23s - loss: 0.4743 - regression_loss: 0.4332 - classification_loss: 0.0411 457/500 [==========================>...] - ETA: 22s - loss: 0.4743 - regression_loss: 0.4332 - classification_loss: 0.0411 458/500 [==========================>...] - ETA: 22s - loss: 0.4752 - regression_loss: 0.4341 - classification_loss: 0.0411 459/500 [==========================>...] - ETA: 21s - loss: 0.4758 - regression_loss: 0.4347 - classification_loss: 0.0411 460/500 [==========================>...] 
- ETA: 21s - loss: 0.4755 - regression_loss: 0.4344 - classification_loss: 0.0411 461/500 [==========================>...] - ETA: 20s - loss: 0.4757 - regression_loss: 0.4346 - classification_loss: 0.0411 462/500 [==========================>...] - ETA: 20s - loss: 0.4759 - regression_loss: 0.4348 - classification_loss: 0.0411 463/500 [==========================>...] - ETA: 19s - loss: 0.4767 - regression_loss: 0.4355 - classification_loss: 0.0412 464/500 [==========================>...] - ETA: 19s - loss: 0.4772 - regression_loss: 0.4359 - classification_loss: 0.0412 465/500 [==========================>...] - ETA: 18s - loss: 0.4768 - regression_loss: 0.4356 - classification_loss: 0.0412 466/500 [==========================>...] - ETA: 17s - loss: 0.4761 - regression_loss: 0.4350 - classification_loss: 0.0411 467/500 [===========================>..] - ETA: 17s - loss: 0.4758 - regression_loss: 0.4347 - classification_loss: 0.0411 468/500 [===========================>..] - ETA: 16s - loss: 0.4760 - regression_loss: 0.4349 - classification_loss: 0.0411 469/500 [===========================>..] - ETA: 16s - loss: 0.4757 - regression_loss: 0.4346 - classification_loss: 0.0411 470/500 [===========================>..] - ETA: 15s - loss: 0.4756 - regression_loss: 0.4345 - classification_loss: 0.0411 471/500 [===========================>..] - ETA: 15s - loss: 0.4757 - regression_loss: 0.4346 - classification_loss: 0.0411 472/500 [===========================>..] - ETA: 14s - loss: 0.4754 - regression_loss: 0.4343 - classification_loss: 0.0411 473/500 [===========================>..] - ETA: 14s - loss: 0.4751 - regression_loss: 0.4340 - classification_loss: 0.0411 474/500 [===========================>..] - ETA: 13s - loss: 0.4754 - regression_loss: 0.4342 - classification_loss: 0.0411 475/500 [===========================>..] - ETA: 13s - loss: 0.4758 - regression_loss: 0.4347 - classification_loss: 0.0411 476/500 [===========================>..] 
- ETA: 12s - loss: 0.4763 - regression_loss: 0.4350 - classification_loss: 0.0413 477/500 [===========================>..] - ETA: 12s - loss: 0.4765 - regression_loss: 0.4352 - classification_loss: 0.0413 478/500 [===========================>..] - ETA: 11s - loss: 0.4766 - regression_loss: 0.4354 - classification_loss: 0.0413 479/500 [===========================>..] - ETA: 11s - loss: 0.4766 - regression_loss: 0.4354 - classification_loss: 0.0413 480/500 [===========================>..] - ETA: 10s - loss: 0.4764 - regression_loss: 0.4352 - classification_loss: 0.0412 481/500 [===========================>..] - ETA: 10s - loss: 0.4759 - regression_loss: 0.4348 - classification_loss: 0.0412 482/500 [===========================>..] - ETA: 9s - loss: 0.4758 - regression_loss: 0.4347 - classification_loss: 0.0411  483/500 [===========================>..] - ETA: 8s - loss: 0.4755 - regression_loss: 0.4344 - classification_loss: 0.0411 484/500 [============================>.] - ETA: 8s - loss: 0.4754 - regression_loss: 0.4343 - classification_loss: 0.0411 485/500 [============================>.] - ETA: 7s - loss: 0.4751 - regression_loss: 0.4340 - classification_loss: 0.0410 486/500 [============================>.] - ETA: 7s - loss: 0.4751 - regression_loss: 0.4340 - classification_loss: 0.0410 487/500 [============================>.] - ETA: 6s - loss: 0.4749 - regression_loss: 0.4339 - classification_loss: 0.0410 488/500 [============================>.] - ETA: 6s - loss: 0.4744 - regression_loss: 0.4334 - classification_loss: 0.0409 489/500 [============================>.] - ETA: 5s - loss: 0.4747 - regression_loss: 0.4338 - classification_loss: 0.0409 490/500 [============================>.] - ETA: 5s - loss: 0.4746 - regression_loss: 0.4337 - classification_loss: 0.0409 491/500 [============================>.] - ETA: 4s - loss: 0.4750 - regression_loss: 0.4341 - classification_loss: 0.0409 492/500 [============================>.] 
[... steps 493/500 through 499/500 omitted ...]
500/500 [==============================] - 263s 526ms/step - loss: 0.4738 - regression_loss: 0.4330 - classification_loss: 0.0408
1172 instances of class plum with average precision: 0.7189
mAP: 0.7189
Epoch 00001: saving model to ./training/snapshots/resnet101_pascal_01.h5
Epoch 2/150
[... per-step progress output for epoch 2, steps 1/500 through 7/500, omitted ...]
[... per-step progress output for epoch 2 (steps 8/500 through 55/500; loss ≈ 0.46–0.51, regression_loss ≈ 0.42–0.46, classification_loss ≈ 0.046–0.054) omitted ...]
- ETA: 3:44 - loss: 0.4668 - regression_loss: 0.4208 - classification_loss: 0.0460 56/500 [==>...........................] - ETA: 3:43 - loss: 0.4646 - regression_loss: 0.4192 - classification_loss: 0.0454 57/500 [==>...........................] - ETA: 3:43 - loss: 0.4611 - regression_loss: 0.4162 - classification_loss: 0.0449 58/500 [==>...........................] - ETA: 3:42 - loss: 0.4560 - regression_loss: 0.4117 - classification_loss: 0.0443 59/500 [==>...........................] - ETA: 3:41 - loss: 0.4534 - regression_loss: 0.4095 - classification_loss: 0.0439 60/500 [==>...........................] - ETA: 3:41 - loss: 0.4593 - regression_loss: 0.4149 - classification_loss: 0.0444 61/500 [==>...........................] - ETA: 3:40 - loss: 0.4568 - regression_loss: 0.4128 - classification_loss: 0.0441 62/500 [==>...........................] - ETA: 3:40 - loss: 0.4542 - regression_loss: 0.4105 - classification_loss: 0.0437 63/500 [==>...........................] - ETA: 3:40 - loss: 0.4551 - regression_loss: 0.4112 - classification_loss: 0.0439 64/500 [==>...........................] - ETA: 3:39 - loss: 0.4564 - regression_loss: 0.4124 - classification_loss: 0.0440 65/500 [==>...........................] - ETA: 3:38 - loss: 0.4533 - regression_loss: 0.4097 - classification_loss: 0.0436 66/500 [==>...........................] - ETA: 3:38 - loss: 0.4576 - regression_loss: 0.4136 - classification_loss: 0.0440 67/500 [===>..........................] - ETA: 3:38 - loss: 0.4555 - regression_loss: 0.4117 - classification_loss: 0.0438 68/500 [===>..........................] - ETA: 3:37 - loss: 0.4605 - regression_loss: 0.4162 - classification_loss: 0.0443 69/500 [===>..........................] - ETA: 3:37 - loss: 0.4655 - regression_loss: 0.4206 - classification_loss: 0.0449 70/500 [===>..........................] - ETA: 3:36 - loss: 0.4668 - regression_loss: 0.4219 - classification_loss: 0.0449 71/500 [===>..........................] 
- ETA: 3:36 - loss: 0.4694 - regression_loss: 0.4244 - classification_loss: 0.0450 72/500 [===>..........................] - ETA: 3:35 - loss: 0.4694 - regression_loss: 0.4243 - classification_loss: 0.0451 73/500 [===>..........................] - ETA: 3:35 - loss: 0.4706 - regression_loss: 0.4247 - classification_loss: 0.0459 74/500 [===>..........................] - ETA: 3:34 - loss: 0.4718 - regression_loss: 0.4260 - classification_loss: 0.0457 75/500 [===>..........................] - ETA: 3:33 - loss: 0.4732 - regression_loss: 0.4271 - classification_loss: 0.0460 76/500 [===>..........................] - ETA: 3:33 - loss: 0.4722 - regression_loss: 0.4262 - classification_loss: 0.0460 77/500 [===>..........................] - ETA: 3:32 - loss: 0.4695 - regression_loss: 0.4240 - classification_loss: 0.0456 78/500 [===>..........................] - ETA: 3:32 - loss: 0.4677 - regression_loss: 0.4225 - classification_loss: 0.0452 79/500 [===>..........................] - ETA: 3:31 - loss: 0.4634 - regression_loss: 0.4185 - classification_loss: 0.0448 80/500 [===>..........................] - ETA: 3:31 - loss: 0.4621 - regression_loss: 0.4174 - classification_loss: 0.0447 81/500 [===>..........................] - ETA: 3:30 - loss: 0.4658 - regression_loss: 0.4205 - classification_loss: 0.0453 82/500 [===>..........................] - ETA: 3:30 - loss: 0.4663 - regression_loss: 0.4211 - classification_loss: 0.0452 83/500 [===>..........................] - ETA: 3:29 - loss: 0.4639 - regression_loss: 0.4190 - classification_loss: 0.0449 84/500 [====>.........................] - ETA: 3:29 - loss: 0.4631 - regression_loss: 0.4184 - classification_loss: 0.0447 85/500 [====>.........................] - ETA: 3:28 - loss: 0.4604 - regression_loss: 0.4161 - classification_loss: 0.0444 86/500 [====>.........................] - ETA: 3:28 - loss: 0.4603 - regression_loss: 0.4158 - classification_loss: 0.0445 87/500 [====>.........................] 
- ETA: 3:27 - loss: 0.4577 - regression_loss: 0.4136 - classification_loss: 0.0441 88/500 [====>.........................] - ETA: 3:27 - loss: 0.4569 - regression_loss: 0.4128 - classification_loss: 0.0441 89/500 [====>.........................] - ETA: 3:26 - loss: 0.4571 - regression_loss: 0.4131 - classification_loss: 0.0440 90/500 [====>.........................] - ETA: 3:26 - loss: 0.4556 - regression_loss: 0.4118 - classification_loss: 0.0438 91/500 [====>.........................] - ETA: 3:25 - loss: 0.4563 - regression_loss: 0.4125 - classification_loss: 0.0437 92/500 [====>.........................] - ETA: 3:25 - loss: 0.4553 - regression_loss: 0.4114 - classification_loss: 0.0438 93/500 [====>.........................] - ETA: 3:24 - loss: 0.4550 - regression_loss: 0.4112 - classification_loss: 0.0438 94/500 [====>.........................] - ETA: 3:24 - loss: 0.4514 - regression_loss: 0.4079 - classification_loss: 0.0434 95/500 [====>.........................] - ETA: 3:23 - loss: 0.4507 - regression_loss: 0.4075 - classification_loss: 0.0433 96/500 [====>.........................] - ETA: 3:23 - loss: 0.4490 - regression_loss: 0.4059 - classification_loss: 0.0431 97/500 [====>.........................] - ETA: 3:22 - loss: 0.4477 - regression_loss: 0.4047 - classification_loss: 0.0429 98/500 [====>.........................] - ETA: 3:22 - loss: 0.4491 - regression_loss: 0.4061 - classification_loss: 0.0430 99/500 [====>.........................] - ETA: 3:21 - loss: 0.4474 - regression_loss: 0.4047 - classification_loss: 0.0427 100/500 [=====>........................] - ETA: 3:21 - loss: 0.4469 - regression_loss: 0.4043 - classification_loss: 0.0426 101/500 [=====>........................] - ETA: 3:20 - loss: 0.4453 - regression_loss: 0.4029 - classification_loss: 0.0424 102/500 [=====>........................] - ETA: 3:20 - loss: 0.4432 - regression_loss: 0.4011 - classification_loss: 0.0420 103/500 [=====>........................] 
- ETA: 3:19 - loss: 0.4419 - regression_loss: 0.4000 - classification_loss: 0.0419 104/500 [=====>........................] - ETA: 3:19 - loss: 0.4458 - regression_loss: 0.4039 - classification_loss: 0.0419 105/500 [=====>........................] - ETA: 3:18 - loss: 0.4453 - regression_loss: 0.4034 - classification_loss: 0.0419 106/500 [=====>........................] - ETA: 3:18 - loss: 0.4461 - regression_loss: 0.4042 - classification_loss: 0.0419 107/500 [=====>........................] - ETA: 3:17 - loss: 0.4473 - regression_loss: 0.4054 - classification_loss: 0.0419 108/500 [=====>........................] - ETA: 3:17 - loss: 0.4479 - regression_loss: 0.4061 - classification_loss: 0.0419 109/500 [=====>........................] - ETA: 3:16 - loss: 0.4454 - regression_loss: 0.4039 - classification_loss: 0.0415 110/500 [=====>........................] - ETA: 3:16 - loss: 0.4499 - regression_loss: 0.4082 - classification_loss: 0.0417 111/500 [=====>........................] - ETA: 3:15 - loss: 0.4495 - regression_loss: 0.4078 - classification_loss: 0.0417 112/500 [=====>........................] - ETA: 3:15 - loss: 0.4524 - regression_loss: 0.4106 - classification_loss: 0.0417 113/500 [=====>........................] - ETA: 3:14 - loss: 0.4504 - regression_loss: 0.4088 - classification_loss: 0.0415 114/500 [=====>........................] - ETA: 3:14 - loss: 0.4514 - regression_loss: 0.4098 - classification_loss: 0.0415 115/500 [=====>........................] - ETA: 3:13 - loss: 0.4492 - regression_loss: 0.4080 - classification_loss: 0.0413 116/500 [=====>........................] - ETA: 3:13 - loss: 0.4481 - regression_loss: 0.4070 - classification_loss: 0.0411 117/500 [======>.......................] - ETA: 3:12 - loss: 0.4479 - regression_loss: 0.4069 - classification_loss: 0.0410 118/500 [======>.......................] - ETA: 3:12 - loss: 0.4471 - regression_loss: 0.4062 - classification_loss: 0.0409 119/500 [======>.......................] 
- ETA: 3:11 - loss: 0.4508 - regression_loss: 0.4096 - classification_loss: 0.0412 120/500 [======>.......................] - ETA: 3:11 - loss: 0.4522 - regression_loss: 0.4109 - classification_loss: 0.0412 121/500 [======>.......................] - ETA: 3:10 - loss: 0.4520 - regression_loss: 0.4109 - classification_loss: 0.0410 122/500 [======>.......................] - ETA: 3:10 - loss: 0.4496 - regression_loss: 0.4088 - classification_loss: 0.0407 123/500 [======>.......................] - ETA: 3:09 - loss: 0.4507 - regression_loss: 0.4100 - classification_loss: 0.0407 124/500 [======>.......................] - ETA: 3:09 - loss: 0.4518 - regression_loss: 0.4110 - classification_loss: 0.0408 125/500 [======>.......................] - ETA: 3:08 - loss: 0.4549 - regression_loss: 0.4136 - classification_loss: 0.0413 126/500 [======>.......................] - ETA: 3:08 - loss: 0.4538 - regression_loss: 0.4126 - classification_loss: 0.0412 127/500 [======>.......................] - ETA: 3:07 - loss: 0.4533 - regression_loss: 0.4122 - classification_loss: 0.0411 128/500 [======>.......................] - ETA: 3:07 - loss: 0.4543 - regression_loss: 0.4129 - classification_loss: 0.0413 129/500 [======>.......................] - ETA: 3:06 - loss: 0.4526 - regression_loss: 0.4114 - classification_loss: 0.0412 130/500 [======>.......................] - ETA: 3:06 - loss: 0.4516 - regression_loss: 0.4105 - classification_loss: 0.0410 131/500 [======>.......................] - ETA: 3:05 - loss: 0.4501 - regression_loss: 0.4092 - classification_loss: 0.0409 132/500 [======>.......................] - ETA: 3:05 - loss: 0.4500 - regression_loss: 0.4092 - classification_loss: 0.0408 133/500 [======>.......................] - ETA: 3:04 - loss: 0.4525 - regression_loss: 0.4118 - classification_loss: 0.0407 134/500 [=======>......................] - ETA: 3:04 - loss: 0.4550 - regression_loss: 0.4143 - classification_loss: 0.0408 135/500 [=======>......................] 
- ETA: 3:03 - loss: 0.4563 - regression_loss: 0.4154 - classification_loss: 0.0409 136/500 [=======>......................] - ETA: 3:03 - loss: 0.4574 - regression_loss: 0.4165 - classification_loss: 0.0410 137/500 [=======>......................] - ETA: 3:02 - loss: 0.4575 - regression_loss: 0.4165 - classification_loss: 0.0410 138/500 [=======>......................] - ETA: 3:02 - loss: 0.4569 - regression_loss: 0.4159 - classification_loss: 0.0410 139/500 [=======>......................] - ETA: 3:01 - loss: 0.4570 - regression_loss: 0.4161 - classification_loss: 0.0410 140/500 [=======>......................] - ETA: 3:01 - loss: 0.4575 - regression_loss: 0.4165 - classification_loss: 0.0409 141/500 [=======>......................] - ETA: 3:00 - loss: 0.4581 - regression_loss: 0.4171 - classification_loss: 0.0409 142/500 [=======>......................] - ETA: 3:00 - loss: 0.4564 - regression_loss: 0.4156 - classification_loss: 0.0408 143/500 [=======>......................] - ETA: 2:59 - loss: 0.4557 - regression_loss: 0.4150 - classification_loss: 0.0407 144/500 [=======>......................] - ETA: 2:59 - loss: 0.4561 - regression_loss: 0.4154 - classification_loss: 0.0407 145/500 [=======>......................] - ETA: 2:58 - loss: 0.4566 - regression_loss: 0.4160 - classification_loss: 0.0406 146/500 [=======>......................] - ETA: 2:58 - loss: 0.4586 - regression_loss: 0.4179 - classification_loss: 0.0408 147/500 [=======>......................] - ETA: 2:57 - loss: 0.4600 - regression_loss: 0.4193 - classification_loss: 0.0407 148/500 [=======>......................] - ETA: 2:57 - loss: 0.4583 - regression_loss: 0.4178 - classification_loss: 0.0405 149/500 [=======>......................] - ETA: 2:56 - loss: 0.4570 - regression_loss: 0.4167 - classification_loss: 0.0404 150/500 [========>.....................] - ETA: 2:56 - loss: 0.4582 - regression_loss: 0.4177 - classification_loss: 0.0405 151/500 [========>.....................] 
- ETA: 2:55 - loss: 0.4587 - regression_loss: 0.4182 - classification_loss: 0.0405 152/500 [========>.....................] - ETA: 2:55 - loss: 0.4612 - regression_loss: 0.4202 - classification_loss: 0.0410 153/500 [========>.....................] - ETA: 2:54 - loss: 0.4641 - regression_loss: 0.4227 - classification_loss: 0.0414 154/500 [========>.....................] - ETA: 2:54 - loss: 0.4650 - regression_loss: 0.4235 - classification_loss: 0.0415 155/500 [========>.....................] - ETA: 2:53 - loss: 0.4643 - regression_loss: 0.4227 - classification_loss: 0.0416 156/500 [========>.....................] - ETA: 2:53 - loss: 0.4633 - regression_loss: 0.4217 - classification_loss: 0.0416 157/500 [========>.....................] - ETA: 2:52 - loss: 0.4647 - regression_loss: 0.4230 - classification_loss: 0.0416 158/500 [========>.....................] - ETA: 2:52 - loss: 0.4661 - regression_loss: 0.4245 - classification_loss: 0.0416 159/500 [========>.....................] - ETA: 2:51 - loss: 0.4655 - regression_loss: 0.4240 - classification_loss: 0.0415 160/500 [========>.....................] - ETA: 2:51 - loss: 0.4643 - regression_loss: 0.4230 - classification_loss: 0.0414 161/500 [========>.....................] - ETA: 2:50 - loss: 0.4637 - regression_loss: 0.4225 - classification_loss: 0.0412 162/500 [========>.....................] - ETA: 2:50 - loss: 0.4634 - regression_loss: 0.4222 - classification_loss: 0.0412 163/500 [========>.....................] - ETA: 2:49 - loss: 0.4617 - regression_loss: 0.4206 - classification_loss: 0.0410 164/500 [========>.....................] - ETA: 2:49 - loss: 0.4635 - regression_loss: 0.4225 - classification_loss: 0.0410 165/500 [========>.....................] - ETA: 2:48 - loss: 0.4632 - regression_loss: 0.4223 - classification_loss: 0.0408 166/500 [========>.....................] - ETA: 2:48 - loss: 0.4620 - regression_loss: 0.4213 - classification_loss: 0.0407 167/500 [=========>....................] 
- ETA: 2:47 - loss: 0.4607 - regression_loss: 0.4201 - classification_loss: 0.0406 168/500 [=========>....................] - ETA: 2:47 - loss: 0.4596 - regression_loss: 0.4190 - classification_loss: 0.0406 169/500 [=========>....................] - ETA: 2:46 - loss: 0.4608 - regression_loss: 0.4200 - classification_loss: 0.0408 170/500 [=========>....................] - ETA: 2:46 - loss: 0.4634 - regression_loss: 0.4224 - classification_loss: 0.0409 171/500 [=========>....................] - ETA: 2:45 - loss: 0.4641 - regression_loss: 0.4231 - classification_loss: 0.0410 172/500 [=========>....................] - ETA: 2:45 - loss: 0.4640 - regression_loss: 0.4231 - classification_loss: 0.0409 173/500 [=========>....................] - ETA: 2:44 - loss: 0.4650 - regression_loss: 0.4240 - classification_loss: 0.0410 174/500 [=========>....................] - ETA: 2:44 - loss: 0.4654 - regression_loss: 0.4244 - classification_loss: 0.0410 175/500 [=========>....................] - ETA: 2:43 - loss: 0.4655 - regression_loss: 0.4245 - classification_loss: 0.0410 176/500 [=========>....................] - ETA: 2:43 - loss: 0.4643 - regression_loss: 0.4234 - classification_loss: 0.0409 177/500 [=========>....................] - ETA: 2:42 - loss: 0.4662 - regression_loss: 0.4252 - classification_loss: 0.0410 178/500 [=========>....................] - ETA: 2:42 - loss: 0.4665 - regression_loss: 0.4254 - classification_loss: 0.0410 179/500 [=========>....................] - ETA: 2:41 - loss: 0.4650 - regression_loss: 0.4241 - classification_loss: 0.0409 180/500 [=========>....................] - ETA: 2:41 - loss: 0.4655 - regression_loss: 0.4246 - classification_loss: 0.0409 181/500 [=========>....................] - ETA: 2:40 - loss: 0.4663 - regression_loss: 0.4253 - classification_loss: 0.0410 182/500 [=========>....................] - ETA: 2:39 - loss: 0.4660 - regression_loss: 0.4250 - classification_loss: 0.0410 183/500 [=========>....................] 
- ETA: 2:39 - loss: 0.4647 - regression_loss: 0.4239 - classification_loss: 0.0408 184/500 [==========>...................] - ETA: 2:38 - loss: 0.4638 - regression_loss: 0.4230 - classification_loss: 0.0408 185/500 [==========>...................] - ETA: 2:38 - loss: 0.4639 - regression_loss: 0.4233 - classification_loss: 0.0406 186/500 [==========>...................] - ETA: 2:38 - loss: 0.4623 - regression_loss: 0.4218 - classification_loss: 0.0405 187/500 [==========>...................] - ETA: 2:37 - loss: 0.4619 - regression_loss: 0.4214 - classification_loss: 0.0405 188/500 [==========>...................] - ETA: 2:36 - loss: 0.4618 - regression_loss: 0.4213 - classification_loss: 0.0405 189/500 [==========>...................] - ETA: 2:36 - loss: 0.4614 - regression_loss: 0.4210 - classification_loss: 0.0404 190/500 [==========>...................] - ETA: 2:35 - loss: 0.4612 - regression_loss: 0.4208 - classification_loss: 0.0404 191/500 [==========>...................] - ETA: 2:35 - loss: 0.4608 - regression_loss: 0.4204 - classification_loss: 0.0403 192/500 [==========>...................] - ETA: 2:34 - loss: 0.4600 - regression_loss: 0.4197 - classification_loss: 0.0402 193/500 [==========>...................] - ETA: 2:34 - loss: 0.4612 - regression_loss: 0.4206 - classification_loss: 0.0406 194/500 [==========>...................] - ETA: 2:33 - loss: 0.4603 - regression_loss: 0.4197 - classification_loss: 0.0405 195/500 [==========>...................] - ETA: 2:33 - loss: 0.4609 - regression_loss: 0.4202 - classification_loss: 0.0406 196/500 [==========>...................] - ETA: 2:32 - loss: 0.4616 - regression_loss: 0.4206 - classification_loss: 0.0409 197/500 [==========>...................] - ETA: 2:32 - loss: 0.4606 - regression_loss: 0.4198 - classification_loss: 0.0408 198/500 [==========>...................] - ETA: 2:31 - loss: 0.4599 - regression_loss: 0.4190 - classification_loss: 0.0409 199/500 [==========>...................] 
- ETA: 2:31 - loss: 0.4591 - regression_loss: 0.4184 - classification_loss: 0.0408 200/500 [===========>..................] - ETA: 2:30 - loss: 0.4589 - regression_loss: 0.4182 - classification_loss: 0.0407 201/500 [===========>..................] - ETA: 2:30 - loss: 0.4592 - regression_loss: 0.4184 - classification_loss: 0.0408 202/500 [===========>..................] - ETA: 2:29 - loss: 0.4600 - regression_loss: 0.4189 - classification_loss: 0.0411 203/500 [===========>..................] - ETA: 2:29 - loss: 0.4592 - regression_loss: 0.4182 - classification_loss: 0.0410 204/500 [===========>..................] - ETA: 2:28 - loss: 0.4585 - regression_loss: 0.4176 - classification_loss: 0.0409 205/500 [===========>..................] - ETA: 2:28 - loss: 0.4577 - regression_loss: 0.4169 - classification_loss: 0.0408 206/500 [===========>..................] - ETA: 2:27 - loss: 0.4564 - regression_loss: 0.4157 - classification_loss: 0.0407 207/500 [===========>..................] - ETA: 2:27 - loss: 0.4572 - regression_loss: 0.4166 - classification_loss: 0.0406 208/500 [===========>..................] - ETA: 2:26 - loss: 0.4560 - regression_loss: 0.4156 - classification_loss: 0.0404 209/500 [===========>..................] - ETA: 2:26 - loss: 0.4554 - regression_loss: 0.4151 - classification_loss: 0.0404 210/500 [===========>..................] - ETA: 2:25 - loss: 0.4549 - regression_loss: 0.4146 - classification_loss: 0.0403 211/500 [===========>..................] - ETA: 2:25 - loss: 0.4557 - regression_loss: 0.4153 - classification_loss: 0.0404 212/500 [===========>..................] - ETA: 2:24 - loss: 0.4562 - regression_loss: 0.4158 - classification_loss: 0.0403 213/500 [===========>..................] - ETA: 2:24 - loss: 0.4563 - regression_loss: 0.4159 - classification_loss: 0.0403 214/500 [===========>..................] - ETA: 2:23 - loss: 0.4576 - regression_loss: 0.4170 - classification_loss: 0.0406 215/500 [===========>..................] 
- ETA: 2:23 - loss: 0.4568 - regression_loss: 0.4163 - classification_loss: 0.0405 216/500 [===========>..................] - ETA: 2:22 - loss: 0.4567 - regression_loss: 0.4162 - classification_loss: 0.0405 217/500 [============>.................] - ETA: 2:22 - loss: 0.4578 - regression_loss: 0.4173 - classification_loss: 0.0405 218/500 [============>.................] - ETA: 2:21 - loss: 0.4596 - regression_loss: 0.4190 - classification_loss: 0.0406 219/500 [============>.................] - ETA: 2:21 - loss: 0.4594 - regression_loss: 0.4189 - classification_loss: 0.0405 220/500 [============>.................] - ETA: 2:20 - loss: 0.4585 - regression_loss: 0.4181 - classification_loss: 0.0404 221/500 [============>.................] - ETA: 2:20 - loss: 0.4587 - regression_loss: 0.4183 - classification_loss: 0.0404 222/500 [============>.................] - ETA: 2:19 - loss: 0.4581 - regression_loss: 0.4178 - classification_loss: 0.0403 223/500 [============>.................] - ETA: 2:19 - loss: 0.4585 - regression_loss: 0.4182 - classification_loss: 0.0402 224/500 [============>.................] - ETA: 2:18 - loss: 0.4581 - regression_loss: 0.4179 - classification_loss: 0.0401 225/500 [============>.................] - ETA: 2:18 - loss: 0.4576 - regression_loss: 0.4175 - classification_loss: 0.0401 226/500 [============>.................] - ETA: 2:17 - loss: 0.4567 - regression_loss: 0.4167 - classification_loss: 0.0400 227/500 [============>.................] - ETA: 2:17 - loss: 0.4570 - regression_loss: 0.4170 - classification_loss: 0.0400 228/500 [============>.................] - ETA: 2:16 - loss: 0.4568 - regression_loss: 0.4169 - classification_loss: 0.0400 229/500 [============>.................] - ETA: 2:16 - loss: 0.4570 - regression_loss: 0.4172 - classification_loss: 0.0398 230/500 [============>.................] - ETA: 2:15 - loss: 0.4564 - regression_loss: 0.4167 - classification_loss: 0.0397 231/500 [============>.................] 
- ETA: 2:15 - loss: 0.4575 - regression_loss: 0.4176 - classification_loss: 0.0399 232/500 [============>.................] - ETA: 2:14 - loss: 0.4568 - regression_loss: 0.4170 - classification_loss: 0.0398 233/500 [============>.................] - ETA: 2:14 - loss: 0.4562 - regression_loss: 0.4165 - classification_loss: 0.0397 234/500 [=============>................] - ETA: 2:13 - loss: 0.4562 - regression_loss: 0.4165 - classification_loss: 0.0398 235/500 [=============>................] - ETA: 2:13 - loss: 0.4561 - regression_loss: 0.4163 - classification_loss: 0.0398 236/500 [=============>................] - ETA: 2:12 - loss: 0.4565 - regression_loss: 0.4167 - classification_loss: 0.0398 237/500 [=============>................] - ETA: 2:12 - loss: 0.4584 - regression_loss: 0.4184 - classification_loss: 0.0400 238/500 [=============>................] - ETA: 2:11 - loss: 0.4582 - regression_loss: 0.4181 - classification_loss: 0.0400 239/500 [=============>................] - ETA: 2:11 - loss: 0.4579 - regression_loss: 0.4179 - classification_loss: 0.0400 240/500 [=============>................] - ETA: 2:10 - loss: 0.4572 - regression_loss: 0.4173 - classification_loss: 0.0399 241/500 [=============>................] - ETA: 2:10 - loss: 0.4576 - regression_loss: 0.4176 - classification_loss: 0.0400 242/500 [=============>................] - ETA: 2:09 - loss: 0.4569 - regression_loss: 0.4169 - classification_loss: 0.0399 243/500 [=============>................] - ETA: 2:09 - loss: 0.4576 - regression_loss: 0.4176 - classification_loss: 0.0400 244/500 [=============>................] - ETA: 2:08 - loss: 0.4580 - regression_loss: 0.4181 - classification_loss: 0.0400 245/500 [=============>................] - ETA: 2:08 - loss: 0.4588 - regression_loss: 0.4186 - classification_loss: 0.0402 246/500 [=============>................] - ETA: 2:07 - loss: 0.4586 - regression_loss: 0.4184 - classification_loss: 0.0402 247/500 [=============>................] 
- ETA: 2:07 - loss: 0.4589 - regression_loss: 0.4186 - classification_loss: 0.0403 248/500 [=============>................] - ETA: 2:06 - loss: 0.4593 - regression_loss: 0.4189 - classification_loss: 0.0404 249/500 [=============>................] - ETA: 2:06 - loss: 0.4613 - regression_loss: 0.4206 - classification_loss: 0.0407 250/500 [==============>...............] - ETA: 2:05 - loss: 0.4628 - regression_loss: 0.4219 - classification_loss: 0.0409 251/500 [==============>...............] - ETA: 2:05 - loss: 0.4615 - regression_loss: 0.4207 - classification_loss: 0.0408 252/500 [==============>...............] - ETA: 2:04 - loss: 0.4618 - regression_loss: 0.4210 - classification_loss: 0.0409 253/500 [==============>...............] - ETA: 2:04 - loss: 0.4613 - regression_loss: 0.4205 - classification_loss: 0.0408 254/500 [==============>...............] - ETA: 2:03 - loss: 0.4605 - regression_loss: 0.4197 - classification_loss: 0.0407 255/500 [==============>...............] - ETA: 2:03 - loss: 0.4605 - regression_loss: 0.4197 - classification_loss: 0.0408 256/500 [==============>...............] - ETA: 2:02 - loss: 0.4598 - regression_loss: 0.4191 - classification_loss: 0.0407 257/500 [==============>...............] - ETA: 2:02 - loss: 0.4605 - regression_loss: 0.4198 - classification_loss: 0.0407 258/500 [==============>...............] - ETA: 2:01 - loss: 0.4608 - regression_loss: 0.4200 - classification_loss: 0.0408 259/500 [==============>...............] - ETA: 2:01 - loss: 0.4606 - regression_loss: 0.4199 - classification_loss: 0.0407 260/500 [==============>...............] - ETA: 2:00 - loss: 0.4597 - regression_loss: 0.4191 - classification_loss: 0.0406 261/500 [==============>...............] - ETA: 2:00 - loss: 0.4619 - regression_loss: 0.4210 - classification_loss: 0.0409 262/500 [==============>...............] - ETA: 1:59 - loss: 0.4607 - regression_loss: 0.4199 - classification_loss: 0.0408 263/500 [==============>...............] 
[... per-batch progress redraws for epoch 2, steps 264-487 of 500, elided; loss fluctuated between ~0.453 and ~0.469 (regression_loss ~0.414-0.427, classification_loss ~0.040-0.042) ...]
500/500 [==============================] - 251s 503ms/step - loss: 0.4625 - regression_loss: 0.4213 - classification_loss: 0.0412
1172 instances of class plum with average precision: 0.7338
mAP: 0.7338
Epoch 00002: saving model to ./training/snapshots/resnet101_pascal_02.h5
Epoch 3/150
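As a sanity check on the log above (not part of the original output): in this RetinaNet-style training loop the reported `loss` is the sum of the box-regression and classification components, which the final epoch-2 step confirms numerically.

```python
# Values taken from the final step of epoch 2 (500/500) in the log above.
regression_loss = 0.4213
classification_loss = 0.0412
reported_total = 0.4625

# The reported total should equal the component sum up to print rounding.
assert abs((regression_loss + classification_loss) - reported_total) < 1e-4
print("loss decomposition consistent:", regression_loss + classification_loss)
```

This is only an arithmetic check of the logged numbers; the actual loss terms (smooth-L1 regression and focal classification in RetinaNet) are computed per anchor inside the model.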
[... per-batch progress redraws for epoch 3, steps 2-98 of 500, truncated here; loss spiked to ~0.52 in the first few steps and settled around 0.43 (regression_loss ~0.39, classification_loss ~0.035) ...]
- ETA: 3:21 - loss: 0.4274 - regression_loss: 0.3922 - classification_loss: 0.0352 99/500 [====>.........................] - ETA: 3:20 - loss: 0.4291 - regression_loss: 0.3936 - classification_loss: 0.0354 100/500 [=====>........................] - ETA: 3:20 - loss: 0.4309 - regression_loss: 0.3953 - classification_loss: 0.0357 101/500 [=====>........................] - ETA: 3:19 - loss: 0.4347 - regression_loss: 0.3983 - classification_loss: 0.0363 102/500 [=====>........................] - ETA: 3:19 - loss: 0.4332 - regression_loss: 0.3971 - classification_loss: 0.0361 103/500 [=====>........................] - ETA: 3:18 - loss: 0.4312 - regression_loss: 0.3953 - classification_loss: 0.0359 104/500 [=====>........................] - ETA: 3:17 - loss: 0.4328 - regression_loss: 0.3969 - classification_loss: 0.0359 105/500 [=====>........................] - ETA: 3:17 - loss: 0.4325 - regression_loss: 0.3967 - classification_loss: 0.0358 106/500 [=====>........................] - ETA: 3:16 - loss: 0.4310 - regression_loss: 0.3953 - classification_loss: 0.0357 107/500 [=====>........................] - ETA: 3:16 - loss: 0.4310 - regression_loss: 0.3954 - classification_loss: 0.0356 108/500 [=====>........................] - ETA: 3:16 - loss: 0.4304 - regression_loss: 0.3946 - classification_loss: 0.0358 109/500 [=====>........................] - ETA: 3:15 - loss: 0.4300 - regression_loss: 0.3941 - classification_loss: 0.0359 110/500 [=====>........................] - ETA: 3:15 - loss: 0.4341 - regression_loss: 0.3975 - classification_loss: 0.0366 111/500 [=====>........................] - ETA: 3:14 - loss: 0.4367 - regression_loss: 0.4000 - classification_loss: 0.0367 112/500 [=====>........................] - ETA: 3:14 - loss: 0.4365 - regression_loss: 0.4000 - classification_loss: 0.0365 113/500 [=====>........................] - ETA: 3:13 - loss: 0.4388 - regression_loss: 0.4022 - classification_loss: 0.0366 114/500 [=====>........................] 
- ETA: 3:12 - loss: 0.4377 - regression_loss: 0.4013 - classification_loss: 0.0365 115/500 [=====>........................] - ETA: 3:12 - loss: 0.4372 - regression_loss: 0.4009 - classification_loss: 0.0363 116/500 [=====>........................] - ETA: 3:12 - loss: 0.4364 - regression_loss: 0.4002 - classification_loss: 0.0362 117/500 [======>.......................] - ETA: 3:11 - loss: 0.4365 - regression_loss: 0.4002 - classification_loss: 0.0363 118/500 [======>.......................] - ETA: 3:11 - loss: 0.4357 - regression_loss: 0.3995 - classification_loss: 0.0362 119/500 [======>.......................] - ETA: 3:10 - loss: 0.4365 - regression_loss: 0.4003 - classification_loss: 0.0363 120/500 [======>.......................] - ETA: 3:10 - loss: 0.4368 - regression_loss: 0.4005 - classification_loss: 0.0362 121/500 [======>.......................] - ETA: 3:09 - loss: 0.4371 - regression_loss: 0.4011 - classification_loss: 0.0360 122/500 [======>.......................] - ETA: 3:09 - loss: 0.4358 - regression_loss: 0.3999 - classification_loss: 0.0358 123/500 [======>.......................] - ETA: 3:08 - loss: 0.4341 - regression_loss: 0.3985 - classification_loss: 0.0356 124/500 [======>.......................] - ETA: 3:08 - loss: 0.4333 - regression_loss: 0.3978 - classification_loss: 0.0356 125/500 [======>.......................] - ETA: 3:07 - loss: 0.4340 - regression_loss: 0.3984 - classification_loss: 0.0355 126/500 [======>.......................] - ETA: 3:07 - loss: 0.4323 - regression_loss: 0.3968 - classification_loss: 0.0355 127/500 [======>.......................] - ETA: 3:06 - loss: 0.4325 - regression_loss: 0.3969 - classification_loss: 0.0356 128/500 [======>.......................] - ETA: 3:06 - loss: 0.4326 - regression_loss: 0.3970 - classification_loss: 0.0356 129/500 [======>.......................] - ETA: 3:05 - loss: 0.4319 - regression_loss: 0.3964 - classification_loss: 0.0355 130/500 [======>.......................] 
- ETA: 3:05 - loss: 0.4341 - regression_loss: 0.3982 - classification_loss: 0.0360 131/500 [======>.......................] - ETA: 3:04 - loss: 0.4386 - regression_loss: 0.4027 - classification_loss: 0.0359 132/500 [======>.......................] - ETA: 3:04 - loss: 0.4447 - regression_loss: 0.4080 - classification_loss: 0.0367 133/500 [======>.......................] - ETA: 3:03 - loss: 0.4448 - regression_loss: 0.4081 - classification_loss: 0.0367 134/500 [=======>......................] - ETA: 3:03 - loss: 0.4464 - regression_loss: 0.4095 - classification_loss: 0.0369 135/500 [=======>......................] - ETA: 3:02 - loss: 0.4444 - regression_loss: 0.4077 - classification_loss: 0.0367 136/500 [=======>......................] - ETA: 3:02 - loss: 0.4445 - regression_loss: 0.4079 - classification_loss: 0.0366 137/500 [=======>......................] - ETA: 3:01 - loss: 0.4450 - regression_loss: 0.4084 - classification_loss: 0.0367 138/500 [=======>......................] - ETA: 3:01 - loss: 0.4441 - regression_loss: 0.4076 - classification_loss: 0.0365 139/500 [=======>......................] - ETA: 3:00 - loss: 0.4432 - regression_loss: 0.4068 - classification_loss: 0.0364 140/500 [=======>......................] - ETA: 3:00 - loss: 0.4440 - regression_loss: 0.4076 - classification_loss: 0.0363 141/500 [=======>......................] - ETA: 2:59 - loss: 0.4470 - regression_loss: 0.4100 - classification_loss: 0.0370 142/500 [=======>......................] - ETA: 2:59 - loss: 0.4468 - regression_loss: 0.4098 - classification_loss: 0.0369 143/500 [=======>......................] - ETA: 2:58 - loss: 0.4487 - regression_loss: 0.4116 - classification_loss: 0.0371 144/500 [=======>......................] - ETA: 2:58 - loss: 0.4468 - regression_loss: 0.4099 - classification_loss: 0.0369 145/500 [=======>......................] - ETA: 2:57 - loss: 0.4475 - regression_loss: 0.4105 - classification_loss: 0.0370 146/500 [=======>......................] 
- ETA: 2:57 - loss: 0.4472 - regression_loss: 0.4103 - classification_loss: 0.0370 147/500 [=======>......................] - ETA: 2:56 - loss: 0.4490 - regression_loss: 0.4120 - classification_loss: 0.0370 148/500 [=======>......................] - ETA: 2:56 - loss: 0.4477 - regression_loss: 0.4108 - classification_loss: 0.0369 149/500 [=======>......................] - ETA: 2:55 - loss: 0.4482 - regression_loss: 0.4112 - classification_loss: 0.0370 150/500 [========>.....................] - ETA: 2:55 - loss: 0.4492 - regression_loss: 0.4123 - classification_loss: 0.0369 151/500 [========>.....................] - ETA: 2:54 - loss: 0.4497 - regression_loss: 0.4128 - classification_loss: 0.0369 152/500 [========>.....................] - ETA: 2:54 - loss: 0.4482 - regression_loss: 0.4114 - classification_loss: 0.0368 153/500 [========>.....................] - ETA: 2:53 - loss: 0.4488 - regression_loss: 0.4119 - classification_loss: 0.0369 154/500 [========>.....................] - ETA: 2:53 - loss: 0.4510 - regression_loss: 0.4140 - classification_loss: 0.0370 155/500 [========>.....................] - ETA: 2:52 - loss: 0.4507 - regression_loss: 0.4137 - classification_loss: 0.0370 156/500 [========>.....................] - ETA: 2:52 - loss: 0.4507 - regression_loss: 0.4138 - classification_loss: 0.0370 157/500 [========>.....................] - ETA: 2:51 - loss: 0.4496 - regression_loss: 0.4127 - classification_loss: 0.0369 158/500 [========>.....................] - ETA: 2:51 - loss: 0.4494 - regression_loss: 0.4125 - classification_loss: 0.0369 159/500 [========>.....................] - ETA: 2:50 - loss: 0.4486 - regression_loss: 0.4118 - classification_loss: 0.0368 160/500 [========>.....................] - ETA: 2:50 - loss: 0.4477 - regression_loss: 0.4111 - classification_loss: 0.0366 161/500 [========>.....................] - ETA: 2:49 - loss: 0.4489 - regression_loss: 0.4122 - classification_loss: 0.0367 162/500 [========>.....................] 
- ETA: 2:49 - loss: 0.4510 - regression_loss: 0.4141 - classification_loss: 0.0368 163/500 [========>.....................] - ETA: 2:48 - loss: 0.4510 - regression_loss: 0.4142 - classification_loss: 0.0368 164/500 [========>.....................] - ETA: 2:48 - loss: 0.4517 - regression_loss: 0.4148 - classification_loss: 0.0369 165/500 [========>.....................] - ETA: 2:47 - loss: 0.4509 - regression_loss: 0.4141 - classification_loss: 0.0369 166/500 [========>.....................] - ETA: 2:47 - loss: 0.4523 - regression_loss: 0.4152 - classification_loss: 0.0371 167/500 [=========>....................] - ETA: 2:46 - loss: 0.4525 - regression_loss: 0.4154 - classification_loss: 0.0371 168/500 [=========>....................] - ETA: 2:46 - loss: 0.4510 - regression_loss: 0.4141 - classification_loss: 0.0369 169/500 [=========>....................] - ETA: 2:45 - loss: 0.4517 - regression_loss: 0.4147 - classification_loss: 0.0369 170/500 [=========>....................] - ETA: 2:45 - loss: 0.4529 - regression_loss: 0.4160 - classification_loss: 0.0369 171/500 [=========>....................] - ETA: 2:44 - loss: 0.4519 - regression_loss: 0.4150 - classification_loss: 0.0368 172/500 [=========>....................] - ETA: 2:44 - loss: 0.4509 - regression_loss: 0.4142 - classification_loss: 0.0367 173/500 [=========>....................] - ETA: 2:43 - loss: 0.4518 - regression_loss: 0.4150 - classification_loss: 0.0369 174/500 [=========>....................] - ETA: 2:43 - loss: 0.4507 - regression_loss: 0.4139 - classification_loss: 0.0368 175/500 [=========>....................] - ETA: 2:42 - loss: 0.4508 - regression_loss: 0.4139 - classification_loss: 0.0368 176/500 [=========>....................] - ETA: 2:42 - loss: 0.4519 - regression_loss: 0.4149 - classification_loss: 0.0371 177/500 [=========>....................] - ETA: 2:41 - loss: 0.4529 - regression_loss: 0.4156 - classification_loss: 0.0373 178/500 [=========>....................] 
- ETA: 2:41 - loss: 0.4534 - regression_loss: 0.4161 - classification_loss: 0.0373 179/500 [=========>....................] - ETA: 2:40 - loss: 0.4547 - regression_loss: 0.4173 - classification_loss: 0.0374 180/500 [=========>....................] - ETA: 2:40 - loss: 0.4559 - regression_loss: 0.4183 - classification_loss: 0.0375 181/500 [=========>....................] - ETA: 2:39 - loss: 0.4570 - regression_loss: 0.4193 - classification_loss: 0.0377 182/500 [=========>....................] - ETA: 2:39 - loss: 0.4565 - regression_loss: 0.4190 - classification_loss: 0.0375 183/500 [=========>....................] - ETA: 2:38 - loss: 0.4548 - regression_loss: 0.4175 - classification_loss: 0.0374 184/500 [==========>...................] - ETA: 2:38 - loss: 0.4552 - regression_loss: 0.4178 - classification_loss: 0.0373 185/500 [==========>...................] - ETA: 2:37 - loss: 0.4559 - regression_loss: 0.4186 - classification_loss: 0.0373 186/500 [==========>...................] - ETA: 2:37 - loss: 0.4579 - regression_loss: 0.4202 - classification_loss: 0.0376 187/500 [==========>...................] - ETA: 2:36 - loss: 0.4584 - regression_loss: 0.4205 - classification_loss: 0.0379 188/500 [==========>...................] - ETA: 2:36 - loss: 0.4588 - regression_loss: 0.4208 - classification_loss: 0.0380 189/500 [==========>...................] - ETA: 2:35 - loss: 0.4583 - regression_loss: 0.4204 - classification_loss: 0.0379 190/500 [==========>...................] - ETA: 2:35 - loss: 0.4590 - regression_loss: 0.4209 - classification_loss: 0.0381 191/500 [==========>...................] - ETA: 2:34 - loss: 0.4586 - regression_loss: 0.4204 - classification_loss: 0.0381 192/500 [==========>...................] - ETA: 2:34 - loss: 0.4575 - regression_loss: 0.4194 - classification_loss: 0.0381 193/500 [==========>...................] - ETA: 2:33 - loss: 0.4582 - regression_loss: 0.4201 - classification_loss: 0.0381 194/500 [==========>...................] 
- ETA: 2:33 - loss: 0.4573 - regression_loss: 0.4193 - classification_loss: 0.0380 195/500 [==========>...................] - ETA: 2:32 - loss: 0.4580 - regression_loss: 0.4200 - classification_loss: 0.0380 196/500 [==========>...................] - ETA: 2:32 - loss: 0.4596 - regression_loss: 0.4214 - classification_loss: 0.0382 197/500 [==========>...................] - ETA: 2:31 - loss: 0.4590 - regression_loss: 0.4209 - classification_loss: 0.0381 198/500 [==========>...................] - ETA: 2:31 - loss: 0.4588 - regression_loss: 0.4207 - classification_loss: 0.0381 199/500 [==========>...................] - ETA: 2:30 - loss: 0.4586 - regression_loss: 0.4205 - classification_loss: 0.0381 200/500 [===========>..................] - ETA: 2:30 - loss: 0.4586 - regression_loss: 0.4205 - classification_loss: 0.0381 201/500 [===========>..................] - ETA: 2:29 - loss: 0.4575 - regression_loss: 0.4196 - classification_loss: 0.0379 202/500 [===========>..................] - ETA: 2:29 - loss: 0.4562 - regression_loss: 0.4184 - classification_loss: 0.0378 203/500 [===========>..................] - ETA: 2:28 - loss: 0.4556 - regression_loss: 0.4179 - classification_loss: 0.0377 204/500 [===========>..................] - ETA: 2:28 - loss: 0.4551 - regression_loss: 0.4175 - classification_loss: 0.0376 205/500 [===========>..................] - ETA: 2:27 - loss: 0.4550 - regression_loss: 0.4173 - classification_loss: 0.0376 206/500 [===========>..................] - ETA: 2:27 - loss: 0.4546 - regression_loss: 0.4170 - classification_loss: 0.0376 207/500 [===========>..................] - ETA: 2:26 - loss: 0.4549 - regression_loss: 0.4173 - classification_loss: 0.0376 208/500 [===========>..................] - ETA: 2:26 - loss: 0.4548 - regression_loss: 0.4172 - classification_loss: 0.0376 209/500 [===========>..................] - ETA: 2:25 - loss: 0.4552 - regression_loss: 0.4174 - classification_loss: 0.0378 210/500 [===========>..................] 
- ETA: 2:25 - loss: 0.4547 - regression_loss: 0.4170 - classification_loss: 0.0377 211/500 [===========>..................] - ETA: 2:24 - loss: 0.4544 - regression_loss: 0.4167 - classification_loss: 0.0377 212/500 [===========>..................] - ETA: 2:24 - loss: 0.4542 - regression_loss: 0.4165 - classification_loss: 0.0376 213/500 [===========>..................] - ETA: 2:23 - loss: 0.4536 - regression_loss: 0.4159 - classification_loss: 0.0377 214/500 [===========>..................] - ETA: 2:23 - loss: 0.4532 - regression_loss: 0.4156 - classification_loss: 0.0377 215/500 [===========>..................] - ETA: 2:22 - loss: 0.4532 - regression_loss: 0.4155 - classification_loss: 0.0377 216/500 [===========>..................] - ETA: 2:22 - loss: 0.4534 - regression_loss: 0.4157 - classification_loss: 0.0377 217/500 [============>.................] - ETA: 2:21 - loss: 0.4534 - regression_loss: 0.4156 - classification_loss: 0.0378 218/500 [============>.................] - ETA: 2:21 - loss: 0.4531 - regression_loss: 0.4154 - classification_loss: 0.0377 219/500 [============>.................] - ETA: 2:20 - loss: 0.4544 - regression_loss: 0.4166 - classification_loss: 0.0378 220/500 [============>.................] - ETA: 2:20 - loss: 0.4555 - regression_loss: 0.4175 - classification_loss: 0.0380 221/500 [============>.................] - ETA: 2:19 - loss: 0.4553 - regression_loss: 0.4173 - classification_loss: 0.0379 222/500 [============>.................] - ETA: 2:19 - loss: 0.4559 - regression_loss: 0.4179 - classification_loss: 0.0381 223/500 [============>.................] - ETA: 2:18 - loss: 0.4561 - regression_loss: 0.4180 - classification_loss: 0.0381 224/500 [============>.................] - ETA: 2:18 - loss: 0.4559 - regression_loss: 0.4178 - classification_loss: 0.0381 225/500 [============>.................] - ETA: 2:17 - loss: 0.4566 - regression_loss: 0.4183 - classification_loss: 0.0383 226/500 [============>.................] 
- ETA: 2:17 - loss: 0.4559 - regression_loss: 0.4178 - classification_loss: 0.0382 227/500 [============>.................] - ETA: 2:16 - loss: 0.4555 - regression_loss: 0.4174 - classification_loss: 0.0382 228/500 [============>.................] - ETA: 2:16 - loss: 0.4556 - regression_loss: 0.4174 - classification_loss: 0.0382 229/500 [============>.................] - ETA: 2:15 - loss: 0.4559 - regression_loss: 0.4177 - classification_loss: 0.0382 230/500 [============>.................] - ETA: 2:15 - loss: 0.4577 - regression_loss: 0.4190 - classification_loss: 0.0386 231/500 [============>.................] - ETA: 2:14 - loss: 0.4578 - regression_loss: 0.4192 - classification_loss: 0.0386 232/500 [============>.................] - ETA: 2:14 - loss: 0.4584 - regression_loss: 0.4198 - classification_loss: 0.0386 233/500 [============>.................] - ETA: 2:13 - loss: 0.4587 - regression_loss: 0.4202 - classification_loss: 0.0385 234/500 [=============>................] - ETA: 2:13 - loss: 0.4584 - regression_loss: 0.4199 - classification_loss: 0.0385 235/500 [=============>................] - ETA: 2:12 - loss: 0.4580 - regression_loss: 0.4196 - classification_loss: 0.0384 236/500 [=============>................] - ETA: 2:12 - loss: 0.4583 - regression_loss: 0.4197 - classification_loss: 0.0386 237/500 [=============>................] - ETA: 2:11 - loss: 0.4576 - regression_loss: 0.4190 - classification_loss: 0.0386 238/500 [=============>................] - ETA: 2:11 - loss: 0.4579 - regression_loss: 0.4193 - classification_loss: 0.0386 239/500 [=============>................] - ETA: 2:10 - loss: 0.4578 - regression_loss: 0.4193 - classification_loss: 0.0385 240/500 [=============>................] - ETA: 2:10 - loss: 0.4590 - regression_loss: 0.4204 - classification_loss: 0.0386 241/500 [=============>................] - ETA: 2:09 - loss: 0.4595 - regression_loss: 0.4209 - classification_loss: 0.0386 242/500 [=============>................] 
- ETA: 2:09 - loss: 0.4598 - regression_loss: 0.4212 - classification_loss: 0.0386 243/500 [=============>................] - ETA: 2:08 - loss: 0.4610 - regression_loss: 0.4225 - classification_loss: 0.0386 244/500 [=============>................] - ETA: 2:08 - loss: 0.4604 - regression_loss: 0.4219 - classification_loss: 0.0385 245/500 [=============>................] - ETA: 2:07 - loss: 0.4601 - regression_loss: 0.4217 - classification_loss: 0.0385 246/500 [=============>................] - ETA: 2:07 - loss: 0.4593 - regression_loss: 0.4209 - classification_loss: 0.0384 247/500 [=============>................] - ETA: 2:06 - loss: 0.4597 - regression_loss: 0.4213 - classification_loss: 0.0384 248/500 [=============>................] - ETA: 2:06 - loss: 0.4592 - regression_loss: 0.4207 - classification_loss: 0.0385 249/500 [=============>................] - ETA: 2:05 - loss: 0.4598 - regression_loss: 0.4212 - classification_loss: 0.0386 250/500 [==============>...............] - ETA: 2:05 - loss: 0.4591 - regression_loss: 0.4205 - classification_loss: 0.0386 251/500 [==============>...............] - ETA: 2:04 - loss: 0.4595 - regression_loss: 0.4209 - classification_loss: 0.0386 252/500 [==============>...............] - ETA: 2:04 - loss: 0.4588 - regression_loss: 0.4203 - classification_loss: 0.0385 253/500 [==============>...............] - ETA: 2:03 - loss: 0.4597 - regression_loss: 0.4212 - classification_loss: 0.0385 254/500 [==============>...............] - ETA: 2:03 - loss: 0.4597 - regression_loss: 0.4212 - classification_loss: 0.0385 255/500 [==============>...............] - ETA: 2:02 - loss: 0.4603 - regression_loss: 0.4217 - classification_loss: 0.0386 256/500 [==============>...............] - ETA: 2:02 - loss: 0.4604 - regression_loss: 0.4219 - classification_loss: 0.0385 257/500 [==============>...............] - ETA: 2:01 - loss: 0.4601 - regression_loss: 0.4217 - classification_loss: 0.0385 258/500 [==============>...............] 
- ETA: 2:01 - loss: 0.4609 - regression_loss: 0.4224 - classification_loss: 0.0385 259/500 [==============>...............] - ETA: 2:00 - loss: 0.4605 - regression_loss: 0.4220 - classification_loss: 0.0385 260/500 [==============>...............] - ETA: 2:00 - loss: 0.4594 - regression_loss: 0.4210 - classification_loss: 0.0384 261/500 [==============>...............] - ETA: 1:59 - loss: 0.4586 - regression_loss: 0.4203 - classification_loss: 0.0383 262/500 [==============>...............] - ETA: 1:59 - loss: 0.4589 - regression_loss: 0.4205 - classification_loss: 0.0384 263/500 [==============>...............] - ETA: 1:58 - loss: 0.4582 - regression_loss: 0.4199 - classification_loss: 0.0383 264/500 [==============>...............] - ETA: 1:58 - loss: 0.4576 - regression_loss: 0.4194 - classification_loss: 0.0382 265/500 [==============>...............] - ETA: 1:57 - loss: 0.4569 - regression_loss: 0.4188 - classification_loss: 0.0382 266/500 [==============>...............] - ETA: 1:57 - loss: 0.4570 - regression_loss: 0.4188 - classification_loss: 0.0382 267/500 [===============>..............] - ETA: 1:56 - loss: 0.4566 - regression_loss: 0.4183 - classification_loss: 0.0382 268/500 [===============>..............] - ETA: 1:56 - loss: 0.4575 - regression_loss: 0.4192 - classification_loss: 0.0383 269/500 [===============>..............] - ETA: 1:55 - loss: 0.4582 - regression_loss: 0.4197 - classification_loss: 0.0385 270/500 [===============>..............] - ETA: 1:55 - loss: 0.4578 - regression_loss: 0.4194 - classification_loss: 0.0384 271/500 [===============>..............] - ETA: 1:54 - loss: 0.4576 - regression_loss: 0.4192 - classification_loss: 0.0383 272/500 [===============>..............] - ETA: 1:54 - loss: 0.4578 - regression_loss: 0.4194 - classification_loss: 0.0384 273/500 [===============>..............] - ETA: 1:53 - loss: 0.4577 - regression_loss: 0.4194 - classification_loss: 0.0383 274/500 [===============>..............] 
- ETA: 1:53 - loss: 0.4579 - regression_loss: 0.4196 - classification_loss: 0.0383 275/500 [===============>..............] - ETA: 1:52 - loss: 0.4582 - regression_loss: 0.4199 - classification_loss: 0.0383 276/500 [===============>..............] - ETA: 1:52 - loss: 0.4592 - regression_loss: 0.4208 - classification_loss: 0.0384 277/500 [===============>..............] - ETA: 1:51 - loss: 0.4591 - regression_loss: 0.4207 - classification_loss: 0.0384 278/500 [===============>..............] - ETA: 1:51 - loss: 0.4587 - regression_loss: 0.4204 - classification_loss: 0.0383 279/500 [===============>..............] - ETA: 1:50 - loss: 0.4589 - regression_loss: 0.4206 - classification_loss: 0.0383 280/500 [===============>..............] - ETA: 1:50 - loss: 0.4595 - regression_loss: 0.4210 - classification_loss: 0.0385 281/500 [===============>..............] - ETA: 1:49 - loss: 0.4597 - regression_loss: 0.4212 - classification_loss: 0.0385 282/500 [===============>..............] - ETA: 1:49 - loss: 0.4589 - regression_loss: 0.4204 - classification_loss: 0.0385 283/500 [===============>..............] - ETA: 1:48 - loss: 0.4588 - regression_loss: 0.4202 - classification_loss: 0.0386 284/500 [================>.............] - ETA: 1:48 - loss: 0.4575 - regression_loss: 0.4190 - classification_loss: 0.0385 285/500 [================>.............] - ETA: 1:47 - loss: 0.4569 - regression_loss: 0.4185 - classification_loss: 0.0384 286/500 [================>.............] - ETA: 1:47 - loss: 0.4570 - regression_loss: 0.4185 - classification_loss: 0.0385 287/500 [================>.............] - ETA: 1:46 - loss: 0.4559 - regression_loss: 0.4175 - classification_loss: 0.0384 288/500 [================>.............] - ETA: 1:46 - loss: 0.4550 - regression_loss: 0.4167 - classification_loss: 0.0383 289/500 [================>.............] - ETA: 1:45 - loss: 0.4546 - regression_loss: 0.4164 - classification_loss: 0.0383 290/500 [================>.............] 
- ETA: 1:45 - loss: 0.4545 - regression_loss: 0.4162 - classification_loss: 0.0383 291/500 [================>.............] - ETA: 1:44 - loss: 0.4548 - regression_loss: 0.4166 - classification_loss: 0.0383 292/500 [================>.............] - ETA: 1:44 - loss: 0.4551 - regression_loss: 0.4167 - classification_loss: 0.0384 293/500 [================>.............] - ETA: 1:43 - loss: 0.4550 - regression_loss: 0.4167 - classification_loss: 0.0383 294/500 [================>.............] - ETA: 1:43 - loss: 0.4547 - regression_loss: 0.4164 - classification_loss: 0.0383 295/500 [================>.............] - ETA: 1:42 - loss: 0.4538 - regression_loss: 0.4156 - classification_loss: 0.0382 296/500 [================>.............] - ETA: 1:42 - loss: 0.4538 - regression_loss: 0.4157 - classification_loss: 0.0381 297/500 [================>.............] - ETA: 1:41 - loss: 0.4535 - regression_loss: 0.4155 - classification_loss: 0.0381 298/500 [================>.............] - ETA: 1:41 - loss: 0.4537 - regression_loss: 0.4156 - classification_loss: 0.0381 299/500 [================>.............] - ETA: 1:40 - loss: 0.4535 - regression_loss: 0.4155 - classification_loss: 0.0380 300/500 [=================>............] - ETA: 1:40 - loss: 0.4532 - regression_loss: 0.4152 - classification_loss: 0.0380 301/500 [=================>............] - ETA: 1:39 - loss: 0.4536 - regression_loss: 0.4155 - classification_loss: 0.0381 302/500 [=================>............] - ETA: 1:39 - loss: 0.4533 - regression_loss: 0.4152 - classification_loss: 0.0380 303/500 [=================>............] - ETA: 1:38 - loss: 0.4532 - regression_loss: 0.4152 - classification_loss: 0.0380 304/500 [=================>............] - ETA: 1:38 - loss: 0.4539 - regression_loss: 0.4158 - classification_loss: 0.0381 305/500 [=================>............] - ETA: 1:37 - loss: 0.4533 - regression_loss: 0.4153 - classification_loss: 0.0381 306/500 [=================>............] 
[Epoch 3: per-batch progress lines 307/500–499/500 trimmed; loss fluctuated between ~0.445 and ~0.454]
500/500 [==============================] - 251s 502ms/step - loss: 0.4498 - regression_loss: 0.4130 - classification_loss: 0.0368
1172 instances of class plum with average precision: 0.7278
mAP: 0.7278
Epoch 00003: saving model to ./training/snapshots/resnet101_pascal_03.h5
Epoch 4/150
[Epoch 4: per-batch progress lines 1/500–141/500 trimmed; loss spiked to ~0.55 within the first dozen batches before settling near 0.46]
- ETA: 3:08 - loss: 0.4644 - regression_loss: 0.4255 - classification_loss: 0.0389 126/500 [======>.......................] - ETA: 3:08 - loss: 0.4632 - regression_loss: 0.4244 - classification_loss: 0.0388 127/500 [======>.......................] - ETA: 3:07 - loss: 0.4635 - regression_loss: 0.4247 - classification_loss: 0.0388 128/500 [======>.......................] - ETA: 3:07 - loss: 0.4635 - regression_loss: 0.4247 - classification_loss: 0.0389 129/500 [======>.......................] - ETA: 3:06 - loss: 0.4646 - regression_loss: 0.4256 - classification_loss: 0.0389 130/500 [======>.......................] - ETA: 3:06 - loss: 0.4645 - regression_loss: 0.4254 - classification_loss: 0.0391 131/500 [======>.......................] - ETA: 3:05 - loss: 0.4643 - regression_loss: 0.4253 - classification_loss: 0.0390 132/500 [======>.......................] - ETA: 3:05 - loss: 0.4642 - regression_loss: 0.4253 - classification_loss: 0.0389 133/500 [======>.......................] - ETA: 3:04 - loss: 0.4627 - regression_loss: 0.4240 - classification_loss: 0.0387 134/500 [=======>......................] - ETA: 3:04 - loss: 0.4615 - regression_loss: 0.4229 - classification_loss: 0.0386 135/500 [=======>......................] - ETA: 3:03 - loss: 0.4616 - regression_loss: 0.4230 - classification_loss: 0.0386 136/500 [=======>......................] - ETA: 3:03 - loss: 0.4600 - regression_loss: 0.4216 - classification_loss: 0.0384 137/500 [=======>......................] - ETA: 3:02 - loss: 0.4608 - regression_loss: 0.4223 - classification_loss: 0.0385 138/500 [=======>......................] - ETA: 3:02 - loss: 0.4629 - regression_loss: 0.4242 - classification_loss: 0.0387 139/500 [=======>......................] - ETA: 3:01 - loss: 0.4615 - regression_loss: 0.4230 - classification_loss: 0.0385 140/500 [=======>......................] - ETA: 3:01 - loss: 0.4618 - regression_loss: 0.4231 - classification_loss: 0.0386 141/500 [=======>......................] 
- ETA: 3:00 - loss: 0.4624 - regression_loss: 0.4238 - classification_loss: 0.0386 142/500 [=======>......................] - ETA: 3:00 - loss: 0.4622 - regression_loss: 0.4237 - classification_loss: 0.0385 143/500 [=======>......................] - ETA: 2:59 - loss: 0.4601 - regression_loss: 0.4217 - classification_loss: 0.0384 144/500 [=======>......................] - ETA: 2:59 - loss: 0.4609 - regression_loss: 0.4225 - classification_loss: 0.0385 145/500 [=======>......................] - ETA: 2:58 - loss: 0.4621 - regression_loss: 0.4236 - classification_loss: 0.0385 146/500 [=======>......................] - ETA: 2:58 - loss: 0.4627 - regression_loss: 0.4242 - classification_loss: 0.0385 147/500 [=======>......................] - ETA: 2:57 - loss: 0.4653 - regression_loss: 0.4265 - classification_loss: 0.0388 148/500 [=======>......................] - ETA: 2:57 - loss: 0.4646 - regression_loss: 0.4258 - classification_loss: 0.0388 149/500 [=======>......................] - ETA: 2:56 - loss: 0.4658 - regression_loss: 0.4271 - classification_loss: 0.0387 150/500 [========>.....................] - ETA: 2:56 - loss: 0.4668 - regression_loss: 0.4279 - classification_loss: 0.0389 151/500 [========>.....................] - ETA: 2:55 - loss: 0.4660 - regression_loss: 0.4273 - classification_loss: 0.0387 152/500 [========>.....................] - ETA: 2:55 - loss: 0.4641 - regression_loss: 0.4255 - classification_loss: 0.0386 153/500 [========>.....................] - ETA: 2:54 - loss: 0.4648 - regression_loss: 0.4261 - classification_loss: 0.0386 154/500 [========>.....................] - ETA: 2:54 - loss: 0.4628 - regression_loss: 0.4244 - classification_loss: 0.0384 155/500 [========>.....................] - ETA: 2:53 - loss: 0.4660 - regression_loss: 0.4273 - classification_loss: 0.0386 156/500 [========>.....................] - ETA: 2:53 - loss: 0.4664 - regression_loss: 0.4277 - classification_loss: 0.0387 157/500 [========>.....................] 
- ETA: 2:52 - loss: 0.4660 - regression_loss: 0.4274 - classification_loss: 0.0386 158/500 [========>.....................] - ETA: 2:52 - loss: 0.4662 - regression_loss: 0.4276 - classification_loss: 0.0386 159/500 [========>.....................] - ETA: 2:51 - loss: 0.4663 - regression_loss: 0.4277 - classification_loss: 0.0386 160/500 [========>.....................] - ETA: 2:51 - loss: 0.4667 - regression_loss: 0.4282 - classification_loss: 0.0385 161/500 [========>.....................] - ETA: 2:50 - loss: 0.4665 - regression_loss: 0.4280 - classification_loss: 0.0384 162/500 [========>.....................] - ETA: 2:50 - loss: 0.4674 - regression_loss: 0.4289 - classification_loss: 0.0385 163/500 [========>.....................] - ETA: 2:49 - loss: 0.4664 - regression_loss: 0.4280 - classification_loss: 0.0385 164/500 [========>.....................] - ETA: 2:49 - loss: 0.4681 - regression_loss: 0.4296 - classification_loss: 0.0385 165/500 [========>.....................] - ETA: 2:48 - loss: 0.4665 - regression_loss: 0.4281 - classification_loss: 0.0383 166/500 [========>.....................] - ETA: 2:48 - loss: 0.4675 - regression_loss: 0.4288 - classification_loss: 0.0386 167/500 [=========>....................] - ETA: 2:47 - loss: 0.4676 - regression_loss: 0.4290 - classification_loss: 0.0386 168/500 [=========>....................] - ETA: 2:47 - loss: 0.4674 - regression_loss: 0.4288 - classification_loss: 0.0386 169/500 [=========>....................] - ETA: 2:46 - loss: 0.4666 - regression_loss: 0.4281 - classification_loss: 0.0385 170/500 [=========>....................] - ETA: 2:46 - loss: 0.4678 - regression_loss: 0.4290 - classification_loss: 0.0387 171/500 [=========>....................] - ETA: 2:45 - loss: 0.4678 - regression_loss: 0.4290 - classification_loss: 0.0387 172/500 [=========>....................] - ETA: 2:45 - loss: 0.4667 - regression_loss: 0.4281 - classification_loss: 0.0386 173/500 [=========>....................] 
- ETA: 2:44 - loss: 0.4669 - regression_loss: 0.4281 - classification_loss: 0.0388 174/500 [=========>....................] - ETA: 2:44 - loss: 0.4655 - regression_loss: 0.4269 - classification_loss: 0.0387 175/500 [=========>....................] - ETA: 2:43 - loss: 0.4645 - regression_loss: 0.4260 - classification_loss: 0.0385 176/500 [=========>....................] - ETA: 2:43 - loss: 0.4630 - regression_loss: 0.4246 - classification_loss: 0.0384 177/500 [=========>....................] - ETA: 2:42 - loss: 0.4650 - regression_loss: 0.4265 - classification_loss: 0.0386 178/500 [=========>....................] - ETA: 2:42 - loss: 0.4657 - regression_loss: 0.4271 - classification_loss: 0.0385 179/500 [=========>....................] - ETA: 2:41 - loss: 0.4644 - regression_loss: 0.4260 - classification_loss: 0.0384 180/500 [=========>....................] - ETA: 2:41 - loss: 0.4637 - regression_loss: 0.4254 - classification_loss: 0.0383 181/500 [=========>....................] - ETA: 2:40 - loss: 0.4631 - regression_loss: 0.4249 - classification_loss: 0.0382 182/500 [=========>....................] - ETA: 2:40 - loss: 0.4628 - regression_loss: 0.4246 - classification_loss: 0.0382 183/500 [=========>....................] - ETA: 2:39 - loss: 0.4630 - regression_loss: 0.4248 - classification_loss: 0.0382 184/500 [==========>...................] - ETA: 2:39 - loss: 0.4623 - regression_loss: 0.4240 - classification_loss: 0.0382 185/500 [==========>...................] - ETA: 2:38 - loss: 0.4613 - regression_loss: 0.4232 - classification_loss: 0.0381 186/500 [==========>...................] - ETA: 2:38 - loss: 0.4603 - regression_loss: 0.4222 - classification_loss: 0.0381 187/500 [==========>...................] - ETA: 2:37 - loss: 0.4596 - regression_loss: 0.4216 - classification_loss: 0.0379 188/500 [==========>...................] - ETA: 2:37 - loss: 0.4589 - regression_loss: 0.4209 - classification_loss: 0.0380 189/500 [==========>...................] 
- ETA: 2:36 - loss: 0.4596 - regression_loss: 0.4216 - classification_loss: 0.0380 190/500 [==========>...................] - ETA: 2:35 - loss: 0.4593 - regression_loss: 0.4213 - classification_loss: 0.0380 191/500 [==========>...................] - ETA: 2:35 - loss: 0.4584 - regression_loss: 0.4206 - classification_loss: 0.0379 192/500 [==========>...................] - ETA: 2:34 - loss: 0.4587 - regression_loss: 0.4208 - classification_loss: 0.0379 193/500 [==========>...................] - ETA: 2:34 - loss: 0.4586 - regression_loss: 0.4207 - classification_loss: 0.0379 194/500 [==========>...................] - ETA: 2:33 - loss: 0.4593 - regression_loss: 0.4212 - classification_loss: 0.0381 195/500 [==========>...................] - ETA: 2:33 - loss: 0.4582 - regression_loss: 0.4202 - classification_loss: 0.0380 196/500 [==========>...................] - ETA: 2:32 - loss: 0.4576 - regression_loss: 0.4197 - classification_loss: 0.0379 197/500 [==========>...................] - ETA: 2:32 - loss: 0.4583 - regression_loss: 0.4202 - classification_loss: 0.0381 198/500 [==========>...................] - ETA: 2:31 - loss: 0.4590 - regression_loss: 0.4209 - classification_loss: 0.0381 199/500 [==========>...................] - ETA: 2:31 - loss: 0.4603 - regression_loss: 0.4221 - classification_loss: 0.0382 200/500 [===========>..................] - ETA: 2:30 - loss: 0.4593 - regression_loss: 0.4212 - classification_loss: 0.0381 201/500 [===========>..................] - ETA: 2:30 - loss: 0.4584 - regression_loss: 0.4204 - classification_loss: 0.0380 202/500 [===========>..................] - ETA: 2:29 - loss: 0.4594 - regression_loss: 0.4213 - classification_loss: 0.0381 203/500 [===========>..................] - ETA: 2:29 - loss: 0.4590 - regression_loss: 0.4210 - classification_loss: 0.0380 204/500 [===========>..................] - ETA: 2:28 - loss: 0.4595 - regression_loss: 0.4213 - classification_loss: 0.0381 205/500 [===========>..................] 
- ETA: 2:28 - loss: 0.4608 - regression_loss: 0.4227 - classification_loss: 0.0381 206/500 [===========>..................] - ETA: 2:27 - loss: 0.4603 - regression_loss: 0.4223 - classification_loss: 0.0380 207/500 [===========>..................] - ETA: 2:27 - loss: 0.4622 - regression_loss: 0.4239 - classification_loss: 0.0383 208/500 [===========>..................] - ETA: 2:26 - loss: 0.4627 - regression_loss: 0.4244 - classification_loss: 0.0383 209/500 [===========>..................] - ETA: 2:26 - loss: 0.4629 - regression_loss: 0.4246 - classification_loss: 0.0383 210/500 [===========>..................] - ETA: 2:25 - loss: 0.4633 - regression_loss: 0.4250 - classification_loss: 0.0383 211/500 [===========>..................] - ETA: 2:25 - loss: 0.4632 - regression_loss: 0.4250 - classification_loss: 0.0382 212/500 [===========>..................] - ETA: 2:24 - loss: 0.4623 - regression_loss: 0.4241 - classification_loss: 0.0381 213/500 [===========>..................] - ETA: 2:24 - loss: 0.4611 - regression_loss: 0.4231 - classification_loss: 0.0380 214/500 [===========>..................] - ETA: 2:23 - loss: 0.4619 - regression_loss: 0.4238 - classification_loss: 0.0381 215/500 [===========>..................] - ETA: 2:23 - loss: 0.4613 - regression_loss: 0.4232 - classification_loss: 0.0380 216/500 [===========>..................] - ETA: 2:22 - loss: 0.4627 - regression_loss: 0.4245 - classification_loss: 0.0382 217/500 [============>.................] - ETA: 2:22 - loss: 0.4623 - regression_loss: 0.4242 - classification_loss: 0.0381 218/500 [============>.................] - ETA: 2:21 - loss: 0.4623 - regression_loss: 0.4241 - classification_loss: 0.0382 219/500 [============>.................] - ETA: 2:21 - loss: 0.4619 - regression_loss: 0.4238 - classification_loss: 0.0381 220/500 [============>.................] - ETA: 2:20 - loss: 0.4611 - regression_loss: 0.4231 - classification_loss: 0.0380 221/500 [============>.................] 
- ETA: 2:20 - loss: 0.4613 - regression_loss: 0.4233 - classification_loss: 0.0380 222/500 [============>.................] - ETA: 2:19 - loss: 0.4605 - regression_loss: 0.4226 - classification_loss: 0.0380 223/500 [============>.................] - ETA: 2:19 - loss: 0.4600 - regression_loss: 0.4222 - classification_loss: 0.0379 224/500 [============>.................] - ETA: 2:18 - loss: 0.4614 - regression_loss: 0.4233 - classification_loss: 0.0381 225/500 [============>.................] - ETA: 2:18 - loss: 0.4610 - regression_loss: 0.4230 - classification_loss: 0.0380 226/500 [============>.................] - ETA: 2:17 - loss: 0.4603 - regression_loss: 0.4224 - classification_loss: 0.0379 227/500 [============>.................] - ETA: 2:17 - loss: 0.4618 - regression_loss: 0.4236 - classification_loss: 0.0382 228/500 [============>.................] - ETA: 2:16 - loss: 0.4612 - regression_loss: 0.4231 - classification_loss: 0.0381 229/500 [============>.................] - ETA: 2:16 - loss: 0.4611 - regression_loss: 0.4229 - classification_loss: 0.0381 230/500 [============>.................] - ETA: 2:15 - loss: 0.4615 - regression_loss: 0.4233 - classification_loss: 0.0381 231/500 [============>.................] - ETA: 2:15 - loss: 0.4613 - regression_loss: 0.4232 - classification_loss: 0.0382 232/500 [============>.................] - ETA: 2:14 - loss: 0.4619 - regression_loss: 0.4236 - classification_loss: 0.0382 233/500 [============>.................] - ETA: 2:14 - loss: 0.4618 - regression_loss: 0.4236 - classification_loss: 0.0382 234/500 [=============>................] - ETA: 2:13 - loss: 0.4612 - regression_loss: 0.4230 - classification_loss: 0.0382 235/500 [=============>................] - ETA: 2:13 - loss: 0.4607 - regression_loss: 0.4226 - classification_loss: 0.0381 236/500 [=============>................] - ETA: 2:12 - loss: 0.4603 - regression_loss: 0.4223 - classification_loss: 0.0380 237/500 [=============>................] 
- ETA: 2:12 - loss: 0.4599 - regression_loss: 0.4219 - classification_loss: 0.0380 238/500 [=============>................] - ETA: 2:11 - loss: 0.4591 - regression_loss: 0.4212 - classification_loss: 0.0379 239/500 [=============>................] - ETA: 2:11 - loss: 0.4582 - regression_loss: 0.4204 - classification_loss: 0.0378 240/500 [=============>................] - ETA: 2:10 - loss: 0.4585 - regression_loss: 0.4207 - classification_loss: 0.0378 241/500 [=============>................] - ETA: 2:10 - loss: 0.4589 - regression_loss: 0.4211 - classification_loss: 0.0378 242/500 [=============>................] - ETA: 2:09 - loss: 0.4588 - regression_loss: 0.4210 - classification_loss: 0.0378 243/500 [=============>................] - ETA: 2:09 - loss: 0.4584 - regression_loss: 0.4206 - classification_loss: 0.0377 244/500 [=============>................] - ETA: 2:08 - loss: 0.4574 - regression_loss: 0.4197 - classification_loss: 0.0376 245/500 [=============>................] - ETA: 2:08 - loss: 0.4572 - regression_loss: 0.4196 - classification_loss: 0.0376 246/500 [=============>................] - ETA: 2:07 - loss: 0.4578 - regression_loss: 0.4202 - classification_loss: 0.0377 247/500 [=============>................] - ETA: 2:07 - loss: 0.4578 - regression_loss: 0.4201 - classification_loss: 0.0377 248/500 [=============>................] - ETA: 2:06 - loss: 0.4579 - regression_loss: 0.4202 - classification_loss: 0.0377 249/500 [=============>................] - ETA: 2:06 - loss: 0.4593 - regression_loss: 0.4214 - classification_loss: 0.0378 250/500 [==============>...............] - ETA: 2:05 - loss: 0.4596 - regression_loss: 0.4217 - classification_loss: 0.0379 251/500 [==============>...............] - ETA: 2:05 - loss: 0.4591 - regression_loss: 0.4213 - classification_loss: 0.0378 252/500 [==============>...............] - ETA: 2:04 - loss: 0.4582 - regression_loss: 0.4205 - classification_loss: 0.0377 253/500 [==============>...............] 
- ETA: 2:04 - loss: 0.4587 - regression_loss: 0.4211 - classification_loss: 0.0377 254/500 [==============>...............] - ETA: 2:03 - loss: 0.4578 - regression_loss: 0.4202 - classification_loss: 0.0376 255/500 [==============>...............] - ETA: 2:03 - loss: 0.4586 - regression_loss: 0.4208 - classification_loss: 0.0378 256/500 [==============>...............] - ETA: 2:02 - loss: 0.4582 - regression_loss: 0.4205 - classification_loss: 0.0378 257/500 [==============>...............] - ETA: 2:02 - loss: 0.4587 - regression_loss: 0.4209 - classification_loss: 0.0378 258/500 [==============>...............] - ETA: 2:01 - loss: 0.4581 - regression_loss: 0.4204 - classification_loss: 0.0377 259/500 [==============>...............] - ETA: 2:01 - loss: 0.4582 - regression_loss: 0.4205 - classification_loss: 0.0377 260/500 [==============>...............] - ETA: 2:00 - loss: 0.4571 - regression_loss: 0.4195 - classification_loss: 0.0376 261/500 [==============>...............] - ETA: 2:00 - loss: 0.4568 - regression_loss: 0.4193 - classification_loss: 0.0376 262/500 [==============>...............] - ETA: 1:59 - loss: 0.4560 - regression_loss: 0.4185 - classification_loss: 0.0375 263/500 [==============>...............] - ETA: 1:59 - loss: 0.4556 - regression_loss: 0.4183 - classification_loss: 0.0374 264/500 [==============>...............] - ETA: 1:58 - loss: 0.4557 - regression_loss: 0.4183 - classification_loss: 0.0373 265/500 [==============>...............] - ETA: 1:58 - loss: 0.4560 - regression_loss: 0.4186 - classification_loss: 0.0374 266/500 [==============>...............] - ETA: 1:57 - loss: 0.4564 - regression_loss: 0.4189 - classification_loss: 0.0374 267/500 [===============>..............] - ETA: 1:57 - loss: 0.4560 - regression_loss: 0.4186 - classification_loss: 0.0374 268/500 [===============>..............] - ETA: 1:56 - loss: 0.4563 - regression_loss: 0.4189 - classification_loss: 0.0374 269/500 [===============>..............] 
- ETA: 1:56 - loss: 0.4559 - regression_loss: 0.4186 - classification_loss: 0.0373 270/500 [===============>..............] - ETA: 1:55 - loss: 0.4553 - regression_loss: 0.4180 - classification_loss: 0.0373 271/500 [===============>..............] - ETA: 1:55 - loss: 0.4563 - regression_loss: 0.4190 - classification_loss: 0.0374 272/500 [===============>..............] - ETA: 1:54 - loss: 0.4564 - regression_loss: 0.4191 - classification_loss: 0.0373 273/500 [===============>..............] - ETA: 1:54 - loss: 0.4557 - regression_loss: 0.4184 - classification_loss: 0.0372 274/500 [===============>..............] - ETA: 1:53 - loss: 0.4552 - regression_loss: 0.4180 - classification_loss: 0.0372 275/500 [===============>..............] - ETA: 1:53 - loss: 0.4547 - regression_loss: 0.4176 - classification_loss: 0.0372 276/500 [===============>..............] - ETA: 1:52 - loss: 0.4543 - regression_loss: 0.4171 - classification_loss: 0.0371 277/500 [===============>..............] - ETA: 1:52 - loss: 0.4543 - regression_loss: 0.4172 - classification_loss: 0.0371 278/500 [===============>..............] - ETA: 1:51 - loss: 0.4542 - regression_loss: 0.4171 - classification_loss: 0.0370 279/500 [===============>..............] - ETA: 1:51 - loss: 0.4554 - regression_loss: 0.4181 - classification_loss: 0.0372 280/500 [===============>..............] - ETA: 1:50 - loss: 0.4554 - regression_loss: 0.4181 - classification_loss: 0.0373 281/500 [===============>..............] - ETA: 1:50 - loss: 0.4544 - regression_loss: 0.4172 - classification_loss: 0.0372 282/500 [===============>..............] - ETA: 1:49 - loss: 0.4547 - regression_loss: 0.4175 - classification_loss: 0.0372 283/500 [===============>..............] - ETA: 1:49 - loss: 0.4549 - regression_loss: 0.4177 - classification_loss: 0.0372 284/500 [================>.............] - ETA: 1:48 - loss: 0.4544 - regression_loss: 0.4172 - classification_loss: 0.0372 285/500 [================>.............] 
- ETA: 1:48 - loss: 0.4553 - regression_loss: 0.4181 - classification_loss: 0.0373 286/500 [================>.............] - ETA: 1:47 - loss: 0.4546 - regression_loss: 0.4174 - classification_loss: 0.0372 287/500 [================>.............] - ETA: 1:47 - loss: 0.4539 - regression_loss: 0.4168 - classification_loss: 0.0371 288/500 [================>.............] - ETA: 1:46 - loss: 0.4543 - regression_loss: 0.4171 - classification_loss: 0.0372 289/500 [================>.............] - ETA: 1:46 - loss: 0.4551 - regression_loss: 0.4178 - classification_loss: 0.0374 290/500 [================>.............] - ETA: 1:45 - loss: 0.4551 - regression_loss: 0.4178 - classification_loss: 0.0373 291/500 [================>.............] - ETA: 1:45 - loss: 0.4557 - regression_loss: 0.4184 - classification_loss: 0.0373 292/500 [================>.............] - ETA: 1:44 - loss: 0.4561 - regression_loss: 0.4188 - classification_loss: 0.0374 293/500 [================>.............] - ETA: 1:44 - loss: 0.4561 - regression_loss: 0.4187 - classification_loss: 0.0374 294/500 [================>.............] - ETA: 1:43 - loss: 0.4560 - regression_loss: 0.4186 - classification_loss: 0.0375 295/500 [================>.............] - ETA: 1:43 - loss: 0.4575 - regression_loss: 0.4199 - classification_loss: 0.0376 296/500 [================>.............] - ETA: 1:42 - loss: 0.4584 - regression_loss: 0.4208 - classification_loss: 0.0377 297/500 [================>.............] - ETA: 1:42 - loss: 0.4586 - regression_loss: 0.4209 - classification_loss: 0.0377 298/500 [================>.............] - ETA: 1:41 - loss: 0.4581 - regression_loss: 0.4204 - classification_loss: 0.0377 299/500 [================>.............] - ETA: 1:41 - loss: 0.4579 - regression_loss: 0.4202 - classification_loss: 0.0377 300/500 [=================>............] - ETA: 1:40 - loss: 0.4583 - regression_loss: 0.4205 - classification_loss: 0.0379 301/500 [=================>............] 
- ETA: 1:40 - loss: 0.4577 - regression_loss: 0.4199 - classification_loss: 0.0378 302/500 [=================>............] - ETA: 1:39 - loss: 0.4569 - regression_loss: 0.4192 - classification_loss: 0.0377 303/500 [=================>............] - ETA: 1:39 - loss: 0.4570 - regression_loss: 0.4193 - classification_loss: 0.0377 304/500 [=================>............] - ETA: 1:38 - loss: 0.4573 - regression_loss: 0.4196 - classification_loss: 0.0377 305/500 [=================>............] - ETA: 1:38 - loss: 0.4568 - regression_loss: 0.4192 - classification_loss: 0.0376 306/500 [=================>............] - ETA: 1:37 - loss: 0.4584 - regression_loss: 0.4206 - classification_loss: 0.0378 307/500 [=================>............] - ETA: 1:37 - loss: 0.4580 - regression_loss: 0.4203 - classification_loss: 0.0377 308/500 [=================>............] - ETA: 1:36 - loss: 0.4575 - regression_loss: 0.4198 - classification_loss: 0.0377 309/500 [=================>............] - ETA: 1:36 - loss: 0.4570 - regression_loss: 0.4194 - classification_loss: 0.0376 310/500 [=================>............] - ETA: 1:35 - loss: 0.4570 - regression_loss: 0.4194 - classification_loss: 0.0376 311/500 [=================>............] - ETA: 1:35 - loss: 0.4567 - regression_loss: 0.4191 - classification_loss: 0.0376 312/500 [=================>............] - ETA: 1:34 - loss: 0.4575 - regression_loss: 0.4198 - classification_loss: 0.0377 313/500 [=================>............] - ETA: 1:34 - loss: 0.4566 - regression_loss: 0.4190 - classification_loss: 0.0376 314/500 [=================>............] - ETA: 1:33 - loss: 0.4559 - regression_loss: 0.4184 - classification_loss: 0.0375 315/500 [=================>............] - ETA: 1:33 - loss: 0.4557 - regression_loss: 0.4182 - classification_loss: 0.0374 316/500 [=================>............] - ETA: 1:32 - loss: 0.4560 - regression_loss: 0.4185 - classification_loss: 0.0374 317/500 [==================>...........] 
- ETA: 1:32 - loss: 0.4562 - regression_loss: 0.4187 - classification_loss: 0.0375 318/500 [==================>...........] - ETA: 1:31 - loss: 0.4561 - regression_loss: 0.4186 - classification_loss: 0.0375 319/500 [==================>...........] - ETA: 1:31 - loss: 0.4559 - regression_loss: 0.4184 - classification_loss: 0.0375 320/500 [==================>...........] - ETA: 1:30 - loss: 0.4564 - regression_loss: 0.4190 - classification_loss: 0.0374 321/500 [==================>...........] - ETA: 1:30 - loss: 0.4565 - regression_loss: 0.4190 - classification_loss: 0.0375 322/500 [==================>...........] - ETA: 1:29 - loss: 0.4565 - regression_loss: 0.4191 - classification_loss: 0.0374 323/500 [==================>...........] - ETA: 1:29 - loss: 0.4563 - regression_loss: 0.4190 - classification_loss: 0.0374 324/500 [==================>...........] - ETA: 1:28 - loss: 0.4566 - regression_loss: 0.4191 - classification_loss: 0.0375 325/500 [==================>...........] - ETA: 1:28 - loss: 0.4569 - regression_loss: 0.4194 - classification_loss: 0.0375 326/500 [==================>...........] - ETA: 1:27 - loss: 0.4573 - regression_loss: 0.4198 - classification_loss: 0.0375 327/500 [==================>...........] - ETA: 1:27 - loss: 0.4570 - regression_loss: 0.4196 - classification_loss: 0.0374 328/500 [==================>...........] - ETA: 1:26 - loss: 0.4568 - regression_loss: 0.4194 - classification_loss: 0.0374 329/500 [==================>...........] - ETA: 1:26 - loss: 0.4571 - regression_loss: 0.4196 - classification_loss: 0.0375 330/500 [==================>...........] - ETA: 1:25 - loss: 0.4566 - regression_loss: 0.4192 - classification_loss: 0.0374 331/500 [==================>...........] - ETA: 1:25 - loss: 0.4563 - regression_loss: 0.4188 - classification_loss: 0.0374 332/500 [==================>...........] - ETA: 1:24 - loss: 0.4566 - regression_loss: 0.4191 - classification_loss: 0.0375 333/500 [==================>...........] 
- ETA: 1:24 - loss: 0.4562 - regression_loss: 0.4188 - classification_loss: 0.0374 334/500 [===================>..........] - ETA: 1:23 - loss: 0.4555 - regression_loss: 0.4181 - classification_loss: 0.0374 335/500 [===================>..........] - ETA: 1:23 - loss: 0.4554 - regression_loss: 0.4180 - classification_loss: 0.0373 336/500 [===================>..........] - ETA: 1:22 - loss: 0.4551 - regression_loss: 0.4177 - classification_loss: 0.0373 337/500 [===================>..........] - ETA: 1:22 - loss: 0.4544 - regression_loss: 0.4172 - classification_loss: 0.0373 338/500 [===================>..........] - ETA: 1:21 - loss: 0.4551 - regression_loss: 0.4178 - classification_loss: 0.0373 339/500 [===================>..........] - ETA: 1:21 - loss: 0.4552 - regression_loss: 0.4180 - classification_loss: 0.0373 340/500 [===================>..........] - ETA: 1:20 - loss: 0.4552 - regression_loss: 0.4179 - classification_loss: 0.0373 341/500 [===================>..........] - ETA: 1:20 - loss: 0.4546 - regression_loss: 0.4174 - classification_loss: 0.0372 342/500 [===================>..........] - ETA: 1:19 - loss: 0.4550 - regression_loss: 0.4177 - classification_loss: 0.0373 343/500 [===================>..........] - ETA: 1:19 - loss: 0.4556 - regression_loss: 0.4183 - classification_loss: 0.0372 344/500 [===================>..........] - ETA: 1:18 - loss: 0.4553 - regression_loss: 0.4181 - classification_loss: 0.0372 345/500 [===================>..........] - ETA: 1:18 - loss: 0.4552 - regression_loss: 0.4180 - classification_loss: 0.0372 346/500 [===================>..........] - ETA: 1:17 - loss: 0.4553 - regression_loss: 0.4181 - classification_loss: 0.0372 347/500 [===================>..........] - ETA: 1:17 - loss: 0.4546 - regression_loss: 0.4174 - classification_loss: 0.0371 348/500 [===================>..........] - ETA: 1:16 - loss: 0.4547 - regression_loss: 0.4176 - classification_loss: 0.0372 349/500 [===================>..........] 
[... per-batch progress output for Epoch 4, steps 350-499, condensed (loss held steady around 0.45) ...]
500/500 [==============================] - 252s 503ms/step - loss: 0.4516 - regression_loss: 0.4148 - classification_loss: 0.0367
1172 instances of class plum with average precision: 0.7268
mAP: 0.7268
Epoch 00004: saving model to ./training/snapshots/resnet101_pascal_04.h5
Epoch 5/150
[... per-batch progress output for Epoch 5, steps 1-184, condensed (loss around 0.43, ETA counting down from 4:09) ...]
- ETA: 2:38 - loss: 0.4309 - regression_loss: 0.3964 - classification_loss: 0.0344 185/500 [==========>...................] - ETA: 2:38 - loss: 0.4309 - regression_loss: 0.3964 - classification_loss: 0.0344 186/500 [==========>...................] - ETA: 2:37 - loss: 0.4316 - regression_loss: 0.3971 - classification_loss: 0.0345 187/500 [==========>...................] - ETA: 2:37 - loss: 0.4311 - regression_loss: 0.3966 - classification_loss: 0.0345 188/500 [==========>...................] - ETA: 2:36 - loss: 0.4313 - regression_loss: 0.3968 - classification_loss: 0.0345 189/500 [==========>...................] - ETA: 2:36 - loss: 0.4300 - regression_loss: 0.3956 - classification_loss: 0.0344 190/500 [==========>...................] - ETA: 2:35 - loss: 0.4303 - regression_loss: 0.3959 - classification_loss: 0.0344 191/500 [==========>...................] - ETA: 2:35 - loss: 0.4292 - regression_loss: 0.3950 - classification_loss: 0.0342 192/500 [==========>...................] - ETA: 2:34 - loss: 0.4294 - regression_loss: 0.3952 - classification_loss: 0.0342 193/500 [==========>...................] - ETA: 2:34 - loss: 0.4295 - regression_loss: 0.3952 - classification_loss: 0.0343 194/500 [==========>...................] - ETA: 2:33 - loss: 0.4299 - regression_loss: 0.3955 - classification_loss: 0.0344 195/500 [==========>...................] - ETA: 2:33 - loss: 0.4298 - regression_loss: 0.3955 - classification_loss: 0.0344 196/500 [==========>...................] - ETA: 2:32 - loss: 0.4295 - regression_loss: 0.3952 - classification_loss: 0.0343 197/500 [==========>...................] - ETA: 2:32 - loss: 0.4288 - regression_loss: 0.3945 - classification_loss: 0.0343 198/500 [==========>...................] - ETA: 2:31 - loss: 0.4285 - regression_loss: 0.3942 - classification_loss: 0.0342 199/500 [==========>...................] - ETA: 2:31 - loss: 0.4273 - regression_loss: 0.3932 - classification_loss: 0.0341 200/500 [===========>..................] 
- ETA: 2:30 - loss: 0.4288 - regression_loss: 0.3946 - classification_loss: 0.0343 201/500 [===========>..................] - ETA: 2:30 - loss: 0.4277 - regression_loss: 0.3935 - classification_loss: 0.0341 202/500 [===========>..................] - ETA: 2:29 - loss: 0.4276 - regression_loss: 0.3935 - classification_loss: 0.0341 203/500 [===========>..................] - ETA: 2:29 - loss: 0.4279 - regression_loss: 0.3937 - classification_loss: 0.0341 204/500 [===========>..................] - ETA: 2:28 - loss: 0.4283 - regression_loss: 0.3942 - classification_loss: 0.0342 205/500 [===========>..................] - ETA: 2:28 - loss: 0.4278 - regression_loss: 0.3936 - classification_loss: 0.0341 206/500 [===========>..................] - ETA: 2:27 - loss: 0.4277 - regression_loss: 0.3935 - classification_loss: 0.0341 207/500 [===========>..................] - ETA: 2:27 - loss: 0.4267 - regression_loss: 0.3926 - classification_loss: 0.0340 208/500 [===========>..................] - ETA: 2:26 - loss: 0.4273 - regression_loss: 0.3931 - classification_loss: 0.0342 209/500 [===========>..................] - ETA: 2:26 - loss: 0.4274 - regression_loss: 0.3932 - classification_loss: 0.0342 210/500 [===========>..................] - ETA: 2:25 - loss: 0.4271 - regression_loss: 0.3929 - classification_loss: 0.0342 211/500 [===========>..................] - ETA: 2:25 - loss: 0.4267 - regression_loss: 0.3924 - classification_loss: 0.0342 212/500 [===========>..................] - ETA: 2:24 - loss: 0.4263 - regression_loss: 0.3920 - classification_loss: 0.0343 213/500 [===========>..................] - ETA: 2:24 - loss: 0.4263 - regression_loss: 0.3919 - classification_loss: 0.0344 214/500 [===========>..................] - ETA: 2:23 - loss: 0.4254 - regression_loss: 0.3911 - classification_loss: 0.0343 215/500 [===========>..................] - ETA: 2:23 - loss: 0.4251 - regression_loss: 0.3908 - classification_loss: 0.0343 216/500 [===========>..................] 
- ETA: 2:22 - loss: 0.4240 - regression_loss: 0.3898 - classification_loss: 0.0342 217/500 [============>.................] - ETA: 2:22 - loss: 0.4244 - regression_loss: 0.3902 - classification_loss: 0.0342 218/500 [============>.................] - ETA: 2:21 - loss: 0.4243 - regression_loss: 0.3901 - classification_loss: 0.0342 219/500 [============>.................] - ETA: 2:21 - loss: 0.4248 - regression_loss: 0.3906 - classification_loss: 0.0342 220/500 [============>.................] - ETA: 2:20 - loss: 0.4242 - regression_loss: 0.3900 - classification_loss: 0.0342 221/500 [============>.................] - ETA: 2:20 - loss: 0.4239 - regression_loss: 0.3898 - classification_loss: 0.0341 222/500 [============>.................] - ETA: 2:19 - loss: 0.4251 - regression_loss: 0.3910 - classification_loss: 0.0342 223/500 [============>.................] - ETA: 2:19 - loss: 0.4258 - regression_loss: 0.3916 - classification_loss: 0.0342 224/500 [============>.................] - ETA: 2:18 - loss: 0.4273 - regression_loss: 0.3929 - classification_loss: 0.0344 225/500 [============>.................] - ETA: 2:18 - loss: 0.4278 - regression_loss: 0.3934 - classification_loss: 0.0344 226/500 [============>.................] - ETA: 2:17 - loss: 0.4285 - regression_loss: 0.3941 - classification_loss: 0.0345 227/500 [============>.................] - ETA: 2:17 - loss: 0.4294 - regression_loss: 0.3948 - classification_loss: 0.0346 228/500 [============>.................] - ETA: 2:16 - loss: 0.4287 - regression_loss: 0.3942 - classification_loss: 0.0345 229/500 [============>.................] - ETA: 2:16 - loss: 0.4314 - regression_loss: 0.3966 - classification_loss: 0.0348 230/500 [============>.................] - ETA: 2:15 - loss: 0.4335 - regression_loss: 0.3984 - classification_loss: 0.0352 231/500 [============>.................] - ETA: 2:15 - loss: 0.4336 - regression_loss: 0.3985 - classification_loss: 0.0351 232/500 [============>.................] 
- ETA: 2:14 - loss: 0.4340 - regression_loss: 0.3989 - classification_loss: 0.0351 233/500 [============>.................] - ETA: 2:14 - loss: 0.4345 - regression_loss: 0.3994 - classification_loss: 0.0350 234/500 [=============>................] - ETA: 2:13 - loss: 0.4345 - regression_loss: 0.3995 - classification_loss: 0.0351 235/500 [=============>................] - ETA: 2:13 - loss: 0.4344 - regression_loss: 0.3994 - classification_loss: 0.0350 236/500 [=============>................] - ETA: 2:12 - loss: 0.4345 - regression_loss: 0.3995 - classification_loss: 0.0350 237/500 [=============>................] - ETA: 2:12 - loss: 0.4360 - regression_loss: 0.4009 - classification_loss: 0.0351 238/500 [=============>................] - ETA: 2:11 - loss: 0.4361 - regression_loss: 0.4010 - classification_loss: 0.0351 239/500 [=============>................] - ETA: 2:11 - loss: 0.4371 - regression_loss: 0.4021 - classification_loss: 0.0350 240/500 [=============>................] - ETA: 2:10 - loss: 0.4386 - regression_loss: 0.4036 - classification_loss: 0.0350 241/500 [=============>................] - ETA: 2:10 - loss: 0.4376 - regression_loss: 0.4026 - classification_loss: 0.0349 242/500 [=============>................] - ETA: 2:09 - loss: 0.4373 - regression_loss: 0.4024 - classification_loss: 0.0349 243/500 [=============>................] - ETA: 2:09 - loss: 0.4366 - regression_loss: 0.4018 - classification_loss: 0.0348 244/500 [=============>................] - ETA: 2:08 - loss: 0.4357 - regression_loss: 0.4010 - classification_loss: 0.0347 245/500 [=============>................] - ETA: 2:08 - loss: 0.4354 - regression_loss: 0.4007 - classification_loss: 0.0347 246/500 [=============>................] - ETA: 2:07 - loss: 0.4366 - regression_loss: 0.4016 - classification_loss: 0.0350 247/500 [=============>................] - ETA: 2:07 - loss: 0.4370 - regression_loss: 0.4021 - classification_loss: 0.0350 248/500 [=============>................] 
- ETA: 2:06 - loss: 0.4366 - regression_loss: 0.4016 - classification_loss: 0.0349 249/500 [=============>................] - ETA: 2:06 - loss: 0.4368 - regression_loss: 0.4018 - classification_loss: 0.0349 250/500 [==============>...............] - ETA: 2:05 - loss: 0.4357 - regression_loss: 0.4009 - classification_loss: 0.0348 251/500 [==============>...............] - ETA: 2:05 - loss: 0.4361 - regression_loss: 0.4013 - classification_loss: 0.0348 252/500 [==============>...............] - ETA: 2:04 - loss: 0.4350 - regression_loss: 0.4003 - classification_loss: 0.0347 253/500 [==============>...............] - ETA: 2:04 - loss: 0.4355 - regression_loss: 0.4007 - classification_loss: 0.0348 254/500 [==============>...............] - ETA: 2:03 - loss: 0.4365 - regression_loss: 0.4015 - classification_loss: 0.0349 255/500 [==============>...............] - ETA: 2:03 - loss: 0.4359 - regression_loss: 0.4010 - classification_loss: 0.0349 256/500 [==============>...............] - ETA: 2:02 - loss: 0.4353 - regression_loss: 0.4004 - classification_loss: 0.0348 257/500 [==============>...............] - ETA: 2:02 - loss: 0.4350 - regression_loss: 0.4002 - classification_loss: 0.0348 258/500 [==============>...............] - ETA: 2:01 - loss: 0.4360 - regression_loss: 0.4011 - classification_loss: 0.0349 259/500 [==============>...............] - ETA: 2:00 - loss: 0.4371 - regression_loss: 0.4021 - classification_loss: 0.0350 260/500 [==============>...............] - ETA: 2:00 - loss: 0.4374 - regression_loss: 0.4022 - classification_loss: 0.0352 261/500 [==============>...............] - ETA: 2:00 - loss: 0.4370 - regression_loss: 0.4019 - classification_loss: 0.0351 262/500 [==============>...............] - ETA: 1:59 - loss: 0.4367 - regression_loss: 0.4016 - classification_loss: 0.0351 263/500 [==============>...............] - ETA: 1:58 - loss: 0.4368 - regression_loss: 0.4017 - classification_loss: 0.0351 264/500 [==============>...............] 
- ETA: 1:58 - loss: 0.4375 - regression_loss: 0.4022 - classification_loss: 0.0353 265/500 [==============>...............] - ETA: 1:57 - loss: 0.4383 - regression_loss: 0.4030 - classification_loss: 0.0353 266/500 [==============>...............] - ETA: 1:57 - loss: 0.4381 - regression_loss: 0.4028 - classification_loss: 0.0353 267/500 [===============>..............] - ETA: 1:56 - loss: 0.4386 - regression_loss: 0.4033 - classification_loss: 0.0354 268/500 [===============>..............] - ETA: 1:56 - loss: 0.4388 - regression_loss: 0.4034 - classification_loss: 0.0355 269/500 [===============>..............] - ETA: 1:55 - loss: 0.4389 - regression_loss: 0.4035 - classification_loss: 0.0354 270/500 [===============>..............] - ETA: 1:55 - loss: 0.4387 - regression_loss: 0.4034 - classification_loss: 0.0354 271/500 [===============>..............] - ETA: 1:54 - loss: 0.4384 - regression_loss: 0.4030 - classification_loss: 0.0353 272/500 [===============>..............] - ETA: 1:54 - loss: 0.4380 - regression_loss: 0.4028 - classification_loss: 0.0353 273/500 [===============>..............] - ETA: 1:54 - loss: 0.4380 - regression_loss: 0.4028 - classification_loss: 0.0352 274/500 [===============>..............] - ETA: 1:53 - loss: 0.4372 - regression_loss: 0.4021 - classification_loss: 0.0352 275/500 [===============>..............] - ETA: 1:52 - loss: 0.4364 - regression_loss: 0.4014 - classification_loss: 0.0351 276/500 [===============>..............] - ETA: 1:52 - loss: 0.4363 - regression_loss: 0.4013 - classification_loss: 0.0350 277/500 [===============>..............] - ETA: 1:51 - loss: 0.4365 - regression_loss: 0.4014 - classification_loss: 0.0352 278/500 [===============>..............] - ETA: 1:51 - loss: 0.4366 - regression_loss: 0.4015 - classification_loss: 0.0352 279/500 [===============>..............] - ETA: 1:50 - loss: 0.4369 - regression_loss: 0.4017 - classification_loss: 0.0352 280/500 [===============>..............] 
- ETA: 1:50 - loss: 0.4372 - regression_loss: 0.4020 - classification_loss: 0.0352 281/500 [===============>..............] - ETA: 1:49 - loss: 0.4371 - regression_loss: 0.4019 - classification_loss: 0.0352 282/500 [===============>..............] - ETA: 1:49 - loss: 0.4367 - regression_loss: 0.4015 - classification_loss: 0.0351 283/500 [===============>..............] - ETA: 1:48 - loss: 0.4367 - regression_loss: 0.4016 - classification_loss: 0.0351 284/500 [================>.............] - ETA: 1:48 - loss: 0.4375 - regression_loss: 0.4024 - classification_loss: 0.0351 285/500 [================>.............] - ETA: 1:47 - loss: 0.4372 - regression_loss: 0.4021 - classification_loss: 0.0351 286/500 [================>.............] - ETA: 1:47 - loss: 0.4366 - regression_loss: 0.4016 - classification_loss: 0.0350 287/500 [================>.............] - ETA: 1:46 - loss: 0.4375 - regression_loss: 0.4024 - classification_loss: 0.0351 288/500 [================>.............] - ETA: 1:46 - loss: 0.4366 - regression_loss: 0.4015 - classification_loss: 0.0351 289/500 [================>.............] - ETA: 1:45 - loss: 0.4366 - regression_loss: 0.4015 - classification_loss: 0.0351 290/500 [================>.............] - ETA: 1:45 - loss: 0.4364 - regression_loss: 0.4013 - classification_loss: 0.0351 291/500 [================>.............] - ETA: 1:44 - loss: 0.4363 - regression_loss: 0.4012 - classification_loss: 0.0351 292/500 [================>.............] - ETA: 1:44 - loss: 0.4356 - regression_loss: 0.4006 - classification_loss: 0.0350 293/500 [================>.............] - ETA: 1:43 - loss: 0.4355 - regression_loss: 0.4005 - classification_loss: 0.0349 294/500 [================>.............] - ETA: 1:43 - loss: 0.4354 - regression_loss: 0.4005 - classification_loss: 0.0349 295/500 [================>.............] - ETA: 1:42 - loss: 0.4363 - regression_loss: 0.4013 - classification_loss: 0.0349 296/500 [================>.............] 
- ETA: 1:42 - loss: 0.4366 - regression_loss: 0.4016 - classification_loss: 0.0349 297/500 [================>.............] - ETA: 1:41 - loss: 0.4362 - regression_loss: 0.4013 - classification_loss: 0.0349 298/500 [================>.............] - ETA: 1:41 - loss: 0.4366 - regression_loss: 0.4016 - classification_loss: 0.0350 299/500 [================>.............] - ETA: 1:40 - loss: 0.4368 - regression_loss: 0.4019 - classification_loss: 0.0349 300/500 [=================>............] - ETA: 1:40 - loss: 0.4364 - regression_loss: 0.4015 - classification_loss: 0.0349 301/500 [=================>............] - ETA: 1:39 - loss: 0.4363 - regression_loss: 0.4013 - classification_loss: 0.0350 302/500 [=================>............] - ETA: 1:39 - loss: 0.4361 - regression_loss: 0.4012 - classification_loss: 0.0350 303/500 [=================>............] - ETA: 1:38 - loss: 0.4359 - regression_loss: 0.4009 - classification_loss: 0.0350 304/500 [=================>............] - ETA: 1:38 - loss: 0.4353 - regression_loss: 0.4004 - classification_loss: 0.0349 305/500 [=================>............] - ETA: 1:37 - loss: 0.4350 - regression_loss: 0.4000 - classification_loss: 0.0350 306/500 [=================>............] - ETA: 1:37 - loss: 0.4345 - regression_loss: 0.3996 - classification_loss: 0.0349 307/500 [=================>............] - ETA: 1:36 - loss: 0.4352 - regression_loss: 0.4003 - classification_loss: 0.0349 308/500 [=================>............] - ETA: 1:36 - loss: 0.4351 - regression_loss: 0.4002 - classification_loss: 0.0349 309/500 [=================>............] - ETA: 1:35 - loss: 0.4351 - regression_loss: 0.4002 - classification_loss: 0.0349 310/500 [=================>............] - ETA: 1:35 - loss: 0.4346 - regression_loss: 0.3997 - classification_loss: 0.0349 311/500 [=================>............] - ETA: 1:34 - loss: 0.4349 - regression_loss: 0.4000 - classification_loss: 0.0349 312/500 [=================>............] 
- ETA: 1:34 - loss: 0.4360 - regression_loss: 0.4009 - classification_loss: 0.0351 313/500 [=================>............] - ETA: 1:33 - loss: 0.4366 - regression_loss: 0.4016 - classification_loss: 0.0350 314/500 [=================>............] - ETA: 1:33 - loss: 0.4373 - regression_loss: 0.4022 - classification_loss: 0.0351 315/500 [=================>............] - ETA: 1:32 - loss: 0.4364 - regression_loss: 0.4013 - classification_loss: 0.0351 316/500 [=================>............] - ETA: 1:32 - loss: 0.4359 - regression_loss: 0.4009 - classification_loss: 0.0350 317/500 [==================>...........] - ETA: 1:31 - loss: 0.4357 - regression_loss: 0.4008 - classification_loss: 0.0350 318/500 [==================>...........] - ETA: 1:31 - loss: 0.4361 - regression_loss: 0.4010 - classification_loss: 0.0351 319/500 [==================>...........] - ETA: 1:30 - loss: 0.4358 - regression_loss: 0.4008 - classification_loss: 0.0350 320/500 [==================>...........] - ETA: 1:30 - loss: 0.4357 - regression_loss: 0.4007 - classification_loss: 0.0350 321/500 [==================>...........] - ETA: 1:29 - loss: 0.4359 - regression_loss: 0.4008 - classification_loss: 0.0351 322/500 [==================>...........] - ETA: 1:29 - loss: 0.4368 - regression_loss: 0.4017 - classification_loss: 0.0352 323/500 [==================>...........] - ETA: 1:28 - loss: 0.4366 - regression_loss: 0.4015 - classification_loss: 0.0351 324/500 [==================>...........] - ETA: 1:28 - loss: 0.4370 - regression_loss: 0.4018 - classification_loss: 0.0352 325/500 [==================>...........] - ETA: 1:27 - loss: 0.4371 - regression_loss: 0.4019 - classification_loss: 0.0352 326/500 [==================>...........] - ETA: 1:27 - loss: 0.4384 - regression_loss: 0.4031 - classification_loss: 0.0353 327/500 [==================>...........] - ETA: 1:26 - loss: 0.4376 - regression_loss: 0.4024 - classification_loss: 0.0352 328/500 [==================>...........] 
- ETA: 1:26 - loss: 0.4371 - regression_loss: 0.4020 - classification_loss: 0.0351 329/500 [==================>...........] - ETA: 1:25 - loss: 0.4387 - regression_loss: 0.4036 - classification_loss: 0.0351 330/500 [==================>...........] - ETA: 1:25 - loss: 0.4386 - regression_loss: 0.4035 - classification_loss: 0.0350 331/500 [==================>...........] - ETA: 1:24 - loss: 0.4383 - regression_loss: 0.4032 - classification_loss: 0.0350 332/500 [==================>...........] - ETA: 1:24 - loss: 0.4385 - regression_loss: 0.4035 - classification_loss: 0.0350 333/500 [==================>...........] - ETA: 1:23 - loss: 0.4381 - regression_loss: 0.4032 - classification_loss: 0.0349 334/500 [===================>..........] - ETA: 1:23 - loss: 0.4379 - regression_loss: 0.4030 - classification_loss: 0.0349 335/500 [===================>..........] - ETA: 1:22 - loss: 0.4371 - regression_loss: 0.4022 - classification_loss: 0.0348 336/500 [===================>..........] - ETA: 1:22 - loss: 0.4376 - regression_loss: 0.4027 - classification_loss: 0.0349 337/500 [===================>..........] - ETA: 1:21 - loss: 0.4379 - regression_loss: 0.4030 - classification_loss: 0.0349 338/500 [===================>..........] - ETA: 1:21 - loss: 0.4376 - regression_loss: 0.4027 - classification_loss: 0.0348 339/500 [===================>..........] - ETA: 1:20 - loss: 0.4383 - regression_loss: 0.4034 - classification_loss: 0.0348 340/500 [===================>..........] - ETA: 1:20 - loss: 0.4392 - regression_loss: 0.4043 - classification_loss: 0.0349 341/500 [===================>..........] - ETA: 1:19 - loss: 0.4394 - regression_loss: 0.4045 - classification_loss: 0.0349 342/500 [===================>..........] - ETA: 1:19 - loss: 0.4400 - regression_loss: 0.4049 - classification_loss: 0.0351 343/500 [===================>..........] - ETA: 1:18 - loss: 0.4399 - regression_loss: 0.4048 - classification_loss: 0.0351 344/500 [===================>..........] 
- ETA: 1:18 - loss: 0.4409 - regression_loss: 0.4057 - classification_loss: 0.0352 345/500 [===================>..........] - ETA: 1:17 - loss: 0.4415 - regression_loss: 0.4060 - classification_loss: 0.0355 346/500 [===================>..........] - ETA: 1:17 - loss: 0.4421 - regression_loss: 0.4065 - classification_loss: 0.0355 347/500 [===================>..........] - ETA: 1:16 - loss: 0.4413 - regression_loss: 0.4059 - classification_loss: 0.0355 348/500 [===================>..........] - ETA: 1:16 - loss: 0.4409 - regression_loss: 0.4055 - classification_loss: 0.0354 349/500 [===================>..........] - ETA: 1:15 - loss: 0.4417 - regression_loss: 0.4061 - classification_loss: 0.0355 350/500 [====================>.........] - ETA: 1:15 - loss: 0.4411 - regression_loss: 0.4056 - classification_loss: 0.0355 351/500 [====================>.........] - ETA: 1:14 - loss: 0.4408 - regression_loss: 0.4053 - classification_loss: 0.0354 352/500 [====================>.........] - ETA: 1:14 - loss: 0.4414 - regression_loss: 0.4058 - classification_loss: 0.0355 353/500 [====================>.........] - ETA: 1:13 - loss: 0.4411 - regression_loss: 0.4056 - classification_loss: 0.0355 354/500 [====================>.........] - ETA: 1:13 - loss: 0.4420 - regression_loss: 0.4064 - classification_loss: 0.0356 355/500 [====================>.........] - ETA: 1:12 - loss: 0.4430 - regression_loss: 0.4072 - classification_loss: 0.0358 356/500 [====================>.........] - ETA: 1:12 - loss: 0.4423 - regression_loss: 0.4065 - classification_loss: 0.0358 357/500 [====================>.........] - ETA: 1:11 - loss: 0.4427 - regression_loss: 0.4070 - classification_loss: 0.0358 358/500 [====================>.........] - ETA: 1:11 - loss: 0.4423 - regression_loss: 0.4065 - classification_loss: 0.0357 359/500 [====================>.........] - ETA: 1:10 - loss: 0.4425 - regression_loss: 0.4068 - classification_loss: 0.0357 360/500 [====================>.........] 
- ETA: 1:10 - loss: 0.4422 - regression_loss: 0.4066 - classification_loss: 0.0357 361/500 [====================>.........] - ETA: 1:09 - loss: 0.4416 - regression_loss: 0.4060 - classification_loss: 0.0356 362/500 [====================>.........] - ETA: 1:09 - loss: 0.4415 - regression_loss: 0.4059 - classification_loss: 0.0356 363/500 [====================>.........] - ETA: 1:08 - loss: 0.4415 - regression_loss: 0.4059 - classification_loss: 0.0356 364/500 [====================>.........] - ETA: 1:08 - loss: 0.4417 - regression_loss: 0.4061 - classification_loss: 0.0356 365/500 [====================>.........] - ETA: 1:07 - loss: 0.4417 - regression_loss: 0.4060 - classification_loss: 0.0356 366/500 [====================>.........] - ETA: 1:07 - loss: 0.4415 - regression_loss: 0.4059 - classification_loss: 0.0356 367/500 [=====================>........] - ETA: 1:06 - loss: 0.4409 - regression_loss: 0.4054 - classification_loss: 0.0355 368/500 [=====================>........] - ETA: 1:06 - loss: 0.4413 - regression_loss: 0.4058 - classification_loss: 0.0355 369/500 [=====================>........] - ETA: 1:05 - loss: 0.4411 - regression_loss: 0.4056 - classification_loss: 0.0355 370/500 [=====================>........] - ETA: 1:05 - loss: 0.4404 - regression_loss: 0.4050 - classification_loss: 0.0354 371/500 [=====================>........] - ETA: 1:04 - loss: 0.4404 - regression_loss: 0.4050 - classification_loss: 0.0354 372/500 [=====================>........] - ETA: 1:04 - loss: 0.4407 - regression_loss: 0.4053 - classification_loss: 0.0354 373/500 [=====================>........] - ETA: 1:03 - loss: 0.4406 - regression_loss: 0.4053 - classification_loss: 0.0353 374/500 [=====================>........] - ETA: 1:03 - loss: 0.4414 - regression_loss: 0.4060 - classification_loss: 0.0355 375/500 [=====================>........] - ETA: 1:02 - loss: 0.4411 - regression_loss: 0.4057 - classification_loss: 0.0354 376/500 [=====================>........] 
- ETA: 1:02 - loss: 0.4408 - regression_loss: 0.4054 - classification_loss: 0.0354 377/500 [=====================>........] - ETA: 1:01 - loss: 0.4407 - regression_loss: 0.4054 - classification_loss: 0.0353 378/500 [=====================>........] - ETA: 1:01 - loss: 0.4416 - regression_loss: 0.4062 - classification_loss: 0.0354 379/500 [=====================>........] - ETA: 1:00 - loss: 0.4422 - regression_loss: 0.4067 - classification_loss: 0.0355 380/500 [=====================>........] - ETA: 1:00 - loss: 0.4415 - regression_loss: 0.4061 - classification_loss: 0.0354 381/500 [=====================>........] - ETA: 59s - loss: 0.4411 - regression_loss: 0.4057 - classification_loss: 0.0354  382/500 [=====================>........] - ETA: 59s - loss: 0.4409 - regression_loss: 0.4055 - classification_loss: 0.0354 383/500 [=====================>........] - ETA: 58s - loss: 0.4410 - regression_loss: 0.4057 - classification_loss: 0.0353 384/500 [======================>.......] - ETA: 58s - loss: 0.4412 - regression_loss: 0.4059 - classification_loss: 0.0353 385/500 [======================>.......] - ETA: 57s - loss: 0.4413 - regression_loss: 0.4060 - classification_loss: 0.0353 386/500 [======================>.......] - ETA: 57s - loss: 0.4409 - regression_loss: 0.4056 - classification_loss: 0.0353 387/500 [======================>.......] - ETA: 56s - loss: 0.4401 - regression_loss: 0.4049 - classification_loss: 0.0352 388/500 [======================>.......] - ETA: 56s - loss: 0.4395 - regression_loss: 0.4043 - classification_loss: 0.0352 389/500 [======================>.......] - ETA: 55s - loss: 0.4395 - regression_loss: 0.4043 - classification_loss: 0.0352 390/500 [======================>.......] - ETA: 55s - loss: 0.4395 - regression_loss: 0.4043 - classification_loss: 0.0352 391/500 [======================>.......] - ETA: 54s - loss: 0.4390 - regression_loss: 0.4039 - classification_loss: 0.0352 392/500 [======================>.......] 
500/500 [==============================] - 251s 502ms/step - loss: 0.4353 - regression_loss: 0.4009 - classification_loss: 0.0344
1172 instances of class plum with average precision: 0.7262
mAP: 0.7262
Epoch 00005: saving model to ./training/snapshots/resnet101_pascal_05.h5
Epoch 6/150
- ETA: 2:17 - loss: 0.4340 - regression_loss: 0.3999 - classification_loss: 0.0341 228/500 [============>.................] - ETA: 2:16 - loss: 0.4340 - regression_loss: 0.3999 - classification_loss: 0.0341 229/500 [============>.................] - ETA: 2:16 - loss: 0.4349 - regression_loss: 0.4006 - classification_loss: 0.0343 230/500 [============>.................] - ETA: 2:15 - loss: 0.4359 - regression_loss: 0.4015 - classification_loss: 0.0344 231/500 [============>.................] - ETA: 2:15 - loss: 0.4359 - regression_loss: 0.4015 - classification_loss: 0.0344 232/500 [============>.................] - ETA: 2:14 - loss: 0.4360 - regression_loss: 0.4015 - classification_loss: 0.0345 233/500 [============>.................] - ETA: 2:14 - loss: 0.4353 - regression_loss: 0.4008 - classification_loss: 0.0344 234/500 [=============>................] - ETA: 2:13 - loss: 0.4346 - regression_loss: 0.4002 - classification_loss: 0.0344 235/500 [=============>................] - ETA: 2:13 - loss: 0.4334 - regression_loss: 0.3992 - classification_loss: 0.0343 236/500 [=============>................] - ETA: 2:12 - loss: 0.4350 - regression_loss: 0.4001 - classification_loss: 0.0349 237/500 [=============>................] - ETA: 2:12 - loss: 0.4337 - regression_loss: 0.3989 - classification_loss: 0.0348 238/500 [=============>................] - ETA: 2:11 - loss: 0.4331 - regression_loss: 0.3984 - classification_loss: 0.0348 239/500 [=============>................] - ETA: 2:11 - loss: 0.4333 - regression_loss: 0.3985 - classification_loss: 0.0348 240/500 [=============>................] - ETA: 2:10 - loss: 0.4328 - regression_loss: 0.3981 - classification_loss: 0.0347 241/500 [=============>................] - ETA: 2:10 - loss: 0.4330 - regression_loss: 0.3983 - classification_loss: 0.0347 242/500 [=============>................] - ETA: 2:09 - loss: 0.4331 - regression_loss: 0.3984 - classification_loss: 0.0347 243/500 [=============>................] 
- ETA: 2:09 - loss: 0.4329 - regression_loss: 0.3981 - classification_loss: 0.0348 244/500 [=============>................] - ETA: 2:08 - loss: 0.4332 - regression_loss: 0.3984 - classification_loss: 0.0347 245/500 [=============>................] - ETA: 2:08 - loss: 0.4325 - regression_loss: 0.3978 - classification_loss: 0.0346 246/500 [=============>................] - ETA: 2:07 - loss: 0.4324 - regression_loss: 0.3978 - classification_loss: 0.0346 247/500 [=============>................] - ETA: 2:07 - loss: 0.4322 - regression_loss: 0.3975 - classification_loss: 0.0346 248/500 [=============>................] - ETA: 2:06 - loss: 0.4325 - regression_loss: 0.3979 - classification_loss: 0.0346 249/500 [=============>................] - ETA: 2:06 - loss: 0.4330 - regression_loss: 0.3984 - classification_loss: 0.0346 250/500 [==============>...............] - ETA: 2:05 - loss: 0.4330 - regression_loss: 0.3984 - classification_loss: 0.0346 251/500 [==============>...............] - ETA: 2:05 - loss: 0.4335 - regression_loss: 0.3989 - classification_loss: 0.0346 252/500 [==============>...............] - ETA: 2:04 - loss: 0.4330 - regression_loss: 0.3984 - classification_loss: 0.0346 253/500 [==============>...............] - ETA: 2:04 - loss: 0.4319 - regression_loss: 0.3974 - classification_loss: 0.0345 254/500 [==============>...............] - ETA: 2:03 - loss: 0.4319 - regression_loss: 0.3974 - classification_loss: 0.0345 255/500 [==============>...............] - ETA: 2:03 - loss: 0.4312 - regression_loss: 0.3968 - classification_loss: 0.0344 256/500 [==============>...............] - ETA: 2:02 - loss: 0.4311 - regression_loss: 0.3966 - classification_loss: 0.0344 257/500 [==============>...............] - ETA: 2:02 - loss: 0.4311 - regression_loss: 0.3967 - classification_loss: 0.0345 258/500 [==============>...............] - ETA: 2:01 - loss: 0.4317 - regression_loss: 0.3972 - classification_loss: 0.0345 259/500 [==============>...............] 
- ETA: 2:01 - loss: 0.4325 - regression_loss: 0.3979 - classification_loss: 0.0345 260/500 [==============>...............] - ETA: 2:00 - loss: 0.4329 - regression_loss: 0.3984 - classification_loss: 0.0345 261/500 [==============>...............] - ETA: 2:00 - loss: 0.4333 - regression_loss: 0.3987 - classification_loss: 0.0345 262/500 [==============>...............] - ETA: 1:59 - loss: 0.4343 - regression_loss: 0.3997 - classification_loss: 0.0346 263/500 [==============>...............] - ETA: 1:59 - loss: 0.4354 - regression_loss: 0.4006 - classification_loss: 0.0348 264/500 [==============>...............] - ETA: 1:58 - loss: 0.4349 - regression_loss: 0.4001 - classification_loss: 0.0348 265/500 [==============>...............] - ETA: 1:58 - loss: 0.4343 - regression_loss: 0.3996 - classification_loss: 0.0347 266/500 [==============>...............] - ETA: 1:57 - loss: 0.4354 - regression_loss: 0.4006 - classification_loss: 0.0348 267/500 [===============>..............] - ETA: 1:57 - loss: 0.4356 - regression_loss: 0.4009 - classification_loss: 0.0347 268/500 [===============>..............] - ETA: 1:56 - loss: 0.4372 - regression_loss: 0.4022 - classification_loss: 0.0350 269/500 [===============>..............] - ETA: 1:56 - loss: 0.4376 - regression_loss: 0.4027 - classification_loss: 0.0349 270/500 [===============>..............] - ETA: 1:55 - loss: 0.4382 - regression_loss: 0.4032 - classification_loss: 0.0349 271/500 [===============>..............] - ETA: 1:55 - loss: 0.4383 - regression_loss: 0.4034 - classification_loss: 0.0349 272/500 [===============>..............] - ETA: 1:54 - loss: 0.4379 - regression_loss: 0.4030 - classification_loss: 0.0349 273/500 [===============>..............] - ETA: 1:54 - loss: 0.4376 - regression_loss: 0.4028 - classification_loss: 0.0348 274/500 [===============>..............] - ETA: 1:53 - loss: 0.4378 - regression_loss: 0.4030 - classification_loss: 0.0347 275/500 [===============>..............] 
- ETA: 1:53 - loss: 0.4387 - regression_loss: 0.4038 - classification_loss: 0.0349 276/500 [===============>..............] - ETA: 1:52 - loss: 0.4391 - regression_loss: 0.4042 - classification_loss: 0.0349 277/500 [===============>..............] - ETA: 1:52 - loss: 0.4389 - regression_loss: 0.4041 - classification_loss: 0.0349 278/500 [===============>..............] - ETA: 1:51 - loss: 0.4382 - regression_loss: 0.4034 - classification_loss: 0.0348 279/500 [===============>..............] - ETA: 1:51 - loss: 0.4384 - regression_loss: 0.4036 - classification_loss: 0.0348 280/500 [===============>..............] - ETA: 1:50 - loss: 0.4380 - regression_loss: 0.4033 - classification_loss: 0.0347 281/500 [===============>..............] - ETA: 1:50 - loss: 0.4373 - regression_loss: 0.4026 - classification_loss: 0.0347 282/500 [===============>..............] - ETA: 1:49 - loss: 0.4373 - regression_loss: 0.4027 - classification_loss: 0.0347 283/500 [===============>..............] - ETA: 1:49 - loss: 0.4380 - regression_loss: 0.4033 - classification_loss: 0.0347 284/500 [================>.............] - ETA: 1:48 - loss: 0.4374 - regression_loss: 0.4027 - classification_loss: 0.0347 285/500 [================>.............] - ETA: 1:48 - loss: 0.4373 - regression_loss: 0.4026 - classification_loss: 0.0347 286/500 [================>.............] - ETA: 1:47 - loss: 0.4373 - regression_loss: 0.4025 - classification_loss: 0.0348 287/500 [================>.............] - ETA: 1:47 - loss: 0.4373 - regression_loss: 0.4025 - classification_loss: 0.0347 288/500 [================>.............] - ETA: 1:46 - loss: 0.4370 - regression_loss: 0.4024 - classification_loss: 0.0346 289/500 [================>.............] - ETA: 1:46 - loss: 0.4368 - regression_loss: 0.4023 - classification_loss: 0.0346 290/500 [================>.............] - ETA: 1:45 - loss: 0.4361 - regression_loss: 0.4016 - classification_loss: 0.0345 291/500 [================>.............] 
- ETA: 1:45 - loss: 0.4360 - regression_loss: 0.4015 - classification_loss: 0.0345 292/500 [================>.............] - ETA: 1:44 - loss: 0.4357 - regression_loss: 0.4012 - classification_loss: 0.0345 293/500 [================>.............] - ETA: 1:44 - loss: 0.4357 - regression_loss: 0.4012 - classification_loss: 0.0345 294/500 [================>.............] - ETA: 1:43 - loss: 0.4360 - regression_loss: 0.4014 - classification_loss: 0.0345 295/500 [================>.............] - ETA: 1:43 - loss: 0.4360 - regression_loss: 0.4015 - classification_loss: 0.0345 296/500 [================>.............] - ETA: 1:42 - loss: 0.4361 - regression_loss: 0.4016 - classification_loss: 0.0345 297/500 [================>.............] - ETA: 1:42 - loss: 0.4375 - regression_loss: 0.4029 - classification_loss: 0.0346 298/500 [================>.............] - ETA: 1:41 - loss: 0.4371 - regression_loss: 0.4026 - classification_loss: 0.0345 299/500 [================>.............] - ETA: 1:41 - loss: 0.4367 - regression_loss: 0.4022 - classification_loss: 0.0345 300/500 [=================>............] - ETA: 1:40 - loss: 0.4362 - regression_loss: 0.4018 - classification_loss: 0.0344 301/500 [=================>............] - ETA: 1:40 - loss: 0.4359 - regression_loss: 0.4015 - classification_loss: 0.0344 302/500 [=================>............] - ETA: 1:39 - loss: 0.4356 - regression_loss: 0.4013 - classification_loss: 0.0344 303/500 [=================>............] - ETA: 1:39 - loss: 0.4354 - regression_loss: 0.4011 - classification_loss: 0.0343 304/500 [=================>............] - ETA: 1:38 - loss: 0.4351 - regression_loss: 0.4008 - classification_loss: 0.0343 305/500 [=================>............] - ETA: 1:38 - loss: 0.4346 - regression_loss: 0.4003 - classification_loss: 0.0343 306/500 [=================>............] - ETA: 1:37 - loss: 0.4352 - regression_loss: 0.4008 - classification_loss: 0.0344 307/500 [=================>............] 
- ETA: 1:37 - loss: 0.4347 - regression_loss: 0.4004 - classification_loss: 0.0343 308/500 [=================>............] - ETA: 1:36 - loss: 0.4351 - regression_loss: 0.4006 - classification_loss: 0.0345 309/500 [=================>............] - ETA: 1:36 - loss: 0.4351 - regression_loss: 0.4006 - classification_loss: 0.0345 310/500 [=================>............] - ETA: 1:35 - loss: 0.4355 - regression_loss: 0.4009 - classification_loss: 0.0345 311/500 [=================>............] - ETA: 1:35 - loss: 0.4351 - regression_loss: 0.4006 - classification_loss: 0.0345 312/500 [=================>............] - ETA: 1:34 - loss: 0.4349 - regression_loss: 0.4005 - classification_loss: 0.0345 313/500 [=================>............] - ETA: 1:34 - loss: 0.4343 - regression_loss: 0.3999 - classification_loss: 0.0344 314/500 [=================>............] - ETA: 1:33 - loss: 0.4342 - regression_loss: 0.3998 - classification_loss: 0.0344 315/500 [=================>............] - ETA: 1:33 - loss: 0.4350 - regression_loss: 0.4005 - classification_loss: 0.0344 316/500 [=================>............] - ETA: 1:32 - loss: 0.4341 - regression_loss: 0.3997 - classification_loss: 0.0343 317/500 [==================>...........] - ETA: 1:31 - loss: 0.4347 - regression_loss: 0.4003 - classification_loss: 0.0344 318/500 [==================>...........] - ETA: 1:31 - loss: 0.4349 - regression_loss: 0.4005 - classification_loss: 0.0345 319/500 [==================>...........] - ETA: 1:30 - loss: 0.4360 - regression_loss: 0.4015 - classification_loss: 0.0345 320/500 [==================>...........] - ETA: 1:30 - loss: 0.4364 - regression_loss: 0.4020 - classification_loss: 0.0344 321/500 [==================>...........] - ETA: 1:29 - loss: 0.4369 - regression_loss: 0.4025 - classification_loss: 0.0344 322/500 [==================>...........] - ETA: 1:29 - loss: 0.4373 - regression_loss: 0.4028 - classification_loss: 0.0344 323/500 [==================>...........] 
- ETA: 1:28 - loss: 0.4371 - regression_loss: 0.4027 - classification_loss: 0.0344 324/500 [==================>...........] - ETA: 1:28 - loss: 0.4369 - regression_loss: 0.4025 - classification_loss: 0.0344 325/500 [==================>...........] - ETA: 1:27 - loss: 0.4370 - regression_loss: 0.4027 - classification_loss: 0.0343 326/500 [==================>...........] - ETA: 1:27 - loss: 0.4370 - regression_loss: 0.4026 - classification_loss: 0.0344 327/500 [==================>...........] - ETA: 1:26 - loss: 0.4371 - regression_loss: 0.4027 - classification_loss: 0.0343 328/500 [==================>...........] - ETA: 1:26 - loss: 0.4372 - regression_loss: 0.4029 - classification_loss: 0.0344 329/500 [==================>...........] - ETA: 1:25 - loss: 0.4370 - regression_loss: 0.4026 - classification_loss: 0.0344 330/500 [==================>...........] - ETA: 1:25 - loss: 0.4365 - regression_loss: 0.4022 - classification_loss: 0.0343 331/500 [==================>...........] - ETA: 1:24 - loss: 0.4359 - regression_loss: 0.4017 - classification_loss: 0.0342 332/500 [==================>...........] - ETA: 1:24 - loss: 0.4356 - regression_loss: 0.4015 - classification_loss: 0.0342 333/500 [==================>...........] - ETA: 1:23 - loss: 0.4363 - regression_loss: 0.4021 - classification_loss: 0.0342 334/500 [===================>..........] - ETA: 1:23 - loss: 0.4363 - regression_loss: 0.4021 - classification_loss: 0.0342 335/500 [===================>..........] - ETA: 1:22 - loss: 0.4366 - regression_loss: 0.4024 - classification_loss: 0.0342 336/500 [===================>..........] - ETA: 1:22 - loss: 0.4362 - regression_loss: 0.4021 - classification_loss: 0.0342 337/500 [===================>..........] - ETA: 1:21 - loss: 0.4369 - regression_loss: 0.4028 - classification_loss: 0.0341 338/500 [===================>..........] - ETA: 1:21 - loss: 0.4371 - regression_loss: 0.4031 - classification_loss: 0.0341 339/500 [===================>..........] 
- ETA: 1:20 - loss: 0.4368 - regression_loss: 0.4028 - classification_loss: 0.0340 340/500 [===================>..........] - ETA: 1:20 - loss: 0.4372 - regression_loss: 0.4032 - classification_loss: 0.0340 341/500 [===================>..........] - ETA: 1:19 - loss: 0.4375 - regression_loss: 0.4035 - classification_loss: 0.0340 342/500 [===================>..........] - ETA: 1:19 - loss: 0.4370 - regression_loss: 0.4031 - classification_loss: 0.0340 343/500 [===================>..........] - ETA: 1:18 - loss: 0.4365 - regression_loss: 0.4026 - classification_loss: 0.0339 344/500 [===================>..........] - ETA: 1:18 - loss: 0.4368 - regression_loss: 0.4028 - classification_loss: 0.0339 345/500 [===================>..........] - ETA: 1:17 - loss: 0.4368 - regression_loss: 0.4030 - classification_loss: 0.0338 346/500 [===================>..........] - ETA: 1:17 - loss: 0.4370 - regression_loss: 0.4032 - classification_loss: 0.0338 347/500 [===================>..........] - ETA: 1:16 - loss: 0.4364 - regression_loss: 0.4026 - classification_loss: 0.0338 348/500 [===================>..........] - ETA: 1:16 - loss: 0.4358 - regression_loss: 0.4021 - classification_loss: 0.0337 349/500 [===================>..........] - ETA: 1:15 - loss: 0.4352 - regression_loss: 0.4015 - classification_loss: 0.0337 350/500 [====================>.........] - ETA: 1:15 - loss: 0.4354 - regression_loss: 0.4017 - classification_loss: 0.0337 351/500 [====================>.........] - ETA: 1:14 - loss: 0.4350 - regression_loss: 0.4013 - classification_loss: 0.0337 352/500 [====================>.........] - ETA: 1:14 - loss: 0.4350 - regression_loss: 0.4013 - classification_loss: 0.0337 353/500 [====================>.........] - ETA: 1:13 - loss: 0.4347 - regression_loss: 0.4010 - classification_loss: 0.0337 354/500 [====================>.........] - ETA: 1:13 - loss: 0.4348 - regression_loss: 0.4010 - classification_loss: 0.0337 355/500 [====================>.........] 
- ETA: 1:12 - loss: 0.4340 - regression_loss: 0.4004 - classification_loss: 0.0337 356/500 [====================>.........] - ETA: 1:12 - loss: 0.4348 - regression_loss: 0.4011 - classification_loss: 0.0336 357/500 [====================>.........] - ETA: 1:11 - loss: 0.4344 - regression_loss: 0.4009 - classification_loss: 0.0336 358/500 [====================>.........] - ETA: 1:11 - loss: 0.4337 - regression_loss: 0.4002 - classification_loss: 0.0335 359/500 [====================>.........] - ETA: 1:10 - loss: 0.4337 - regression_loss: 0.4002 - classification_loss: 0.0335 360/500 [====================>.........] - ETA: 1:10 - loss: 0.4333 - regression_loss: 0.3998 - classification_loss: 0.0335 361/500 [====================>.........] - ETA: 1:09 - loss: 0.4336 - regression_loss: 0.4000 - classification_loss: 0.0336 362/500 [====================>.........] - ETA: 1:09 - loss: 0.4339 - regression_loss: 0.4003 - classification_loss: 0.0336 363/500 [====================>.........] - ETA: 1:08 - loss: 0.4339 - regression_loss: 0.4004 - classification_loss: 0.0336 364/500 [====================>.........] - ETA: 1:08 - loss: 0.4333 - regression_loss: 0.3998 - classification_loss: 0.0335 365/500 [====================>.........] - ETA: 1:07 - loss: 0.4330 - regression_loss: 0.3994 - classification_loss: 0.0335 366/500 [====================>.........] - ETA: 1:07 - loss: 0.4332 - regression_loss: 0.3996 - classification_loss: 0.0335 367/500 [=====================>........] - ETA: 1:06 - loss: 0.4329 - regression_loss: 0.3993 - classification_loss: 0.0335 368/500 [=====================>........] - ETA: 1:06 - loss: 0.4323 - regression_loss: 0.3988 - classification_loss: 0.0335 369/500 [=====================>........] - ETA: 1:05 - loss: 0.4320 - regression_loss: 0.3986 - classification_loss: 0.0335 370/500 [=====================>........] - ETA: 1:05 - loss: 0.4322 - regression_loss: 0.3987 - classification_loss: 0.0335 371/500 [=====================>........] 
- ETA: 1:04 - loss: 0.4322 - regression_loss: 0.3988 - classification_loss: 0.0335 372/500 [=====================>........] - ETA: 1:04 - loss: 0.4318 - regression_loss: 0.3984 - classification_loss: 0.0334 373/500 [=====================>........] - ETA: 1:03 - loss: 0.4317 - regression_loss: 0.3983 - classification_loss: 0.0334 374/500 [=====================>........] - ETA: 1:03 - loss: 0.4317 - regression_loss: 0.3983 - classification_loss: 0.0334 375/500 [=====================>........] - ETA: 1:02 - loss: 0.4310 - regression_loss: 0.3977 - classification_loss: 0.0333 376/500 [=====================>........] - ETA: 1:02 - loss: 0.4313 - regression_loss: 0.3979 - classification_loss: 0.0334 377/500 [=====================>........] - ETA: 1:01 - loss: 0.4318 - regression_loss: 0.3983 - classification_loss: 0.0335 378/500 [=====================>........] - ETA: 1:01 - loss: 0.4313 - regression_loss: 0.3979 - classification_loss: 0.0334 379/500 [=====================>........] - ETA: 1:00 - loss: 0.4311 - regression_loss: 0.3977 - classification_loss: 0.0334 380/500 [=====================>........] - ETA: 1:00 - loss: 0.4304 - regression_loss: 0.3971 - classification_loss: 0.0333 381/500 [=====================>........] - ETA: 59s - loss: 0.4311 - regression_loss: 0.3977 - classification_loss: 0.0334  382/500 [=====================>........] - ETA: 59s - loss: 0.4311 - regression_loss: 0.3978 - classification_loss: 0.0333 383/500 [=====================>........] - ETA: 58s - loss: 0.4312 - regression_loss: 0.3979 - classification_loss: 0.0333 384/500 [======================>.......] - ETA: 58s - loss: 0.4311 - regression_loss: 0.3978 - classification_loss: 0.0333 385/500 [======================>.......] - ETA: 57s - loss: 0.4313 - regression_loss: 0.3979 - classification_loss: 0.0334 386/500 [======================>.......] - ETA: 57s - loss: 0.4312 - regression_loss: 0.3978 - classification_loss: 0.0334 387/500 [======================>.......] 
- ETA: 56s - loss: 0.4307 - regression_loss: 0.3974 - classification_loss: 0.0334 388/500 [======================>.......] - ETA: 56s - loss: 0.4317 - regression_loss: 0.3984 - classification_loss: 0.0333 389/500 [======================>.......] - ETA: 55s - loss: 0.4322 - regression_loss: 0.3988 - classification_loss: 0.0334 390/500 [======================>.......] - ETA: 55s - loss: 0.4320 - regression_loss: 0.3987 - classification_loss: 0.0334 391/500 [======================>.......] - ETA: 54s - loss: 0.4332 - regression_loss: 0.3997 - classification_loss: 0.0334 392/500 [======================>.......] - ETA: 54s - loss: 0.4337 - regression_loss: 0.4001 - classification_loss: 0.0336 393/500 [======================>.......] - ETA: 53s - loss: 0.4345 - regression_loss: 0.4008 - classification_loss: 0.0337 394/500 [======================>.......] - ETA: 53s - loss: 0.4343 - regression_loss: 0.4005 - classification_loss: 0.0338 395/500 [======================>.......] - ETA: 52s - loss: 0.4345 - regression_loss: 0.4008 - classification_loss: 0.0337 396/500 [======================>.......] - ETA: 52s - loss: 0.4343 - regression_loss: 0.4006 - classification_loss: 0.0337 397/500 [======================>.......] - ETA: 51s - loss: 0.4345 - regression_loss: 0.4008 - classification_loss: 0.0337 398/500 [======================>.......] - ETA: 51s - loss: 0.4353 - regression_loss: 0.4016 - classification_loss: 0.0337 399/500 [======================>.......] - ETA: 50s - loss: 0.4347 - regression_loss: 0.4010 - classification_loss: 0.0337 400/500 [=======================>......] - ETA: 50s - loss: 0.4358 - regression_loss: 0.4020 - classification_loss: 0.0338 401/500 [=======================>......] - ETA: 49s - loss: 0.4364 - regression_loss: 0.4025 - classification_loss: 0.0339 402/500 [=======================>......] - ETA: 49s - loss: 0.4363 - regression_loss: 0.4024 - classification_loss: 0.0339 403/500 [=======================>......] 
- ETA: 48s - loss: 0.4371 - regression_loss: 0.4031 - classification_loss: 0.0340 404/500 [=======================>......] - ETA: 48s - loss: 0.4374 - regression_loss: 0.4034 - classification_loss: 0.0340 405/500 [=======================>......] - ETA: 47s - loss: 0.4381 - regression_loss: 0.4041 - classification_loss: 0.0340 406/500 [=======================>......] - ETA: 47s - loss: 0.4388 - regression_loss: 0.4047 - classification_loss: 0.0341 407/500 [=======================>......] - ETA: 46s - loss: 0.4386 - regression_loss: 0.4046 - classification_loss: 0.0340 408/500 [=======================>......] - ETA: 46s - loss: 0.4383 - regression_loss: 0.4043 - classification_loss: 0.0340 409/500 [=======================>......] - ETA: 45s - loss: 0.4389 - regression_loss: 0.4049 - classification_loss: 0.0340 410/500 [=======================>......] - ETA: 45s - loss: 0.4387 - regression_loss: 0.4047 - classification_loss: 0.0340 411/500 [=======================>......] - ETA: 44s - loss: 0.4389 - regression_loss: 0.4049 - classification_loss: 0.0340 412/500 [=======================>......] - ETA: 44s - loss: 0.4390 - regression_loss: 0.4050 - classification_loss: 0.0340 413/500 [=======================>......] - ETA: 43s - loss: 0.4391 - regression_loss: 0.4051 - classification_loss: 0.0341 414/500 [=======================>......] - ETA: 43s - loss: 0.4388 - regression_loss: 0.4048 - classification_loss: 0.0340 415/500 [=======================>......] - ETA: 42s - loss: 0.4383 - regression_loss: 0.4043 - classification_loss: 0.0340 416/500 [=======================>......] - ETA: 42s - loss: 0.4397 - regression_loss: 0.4057 - classification_loss: 0.0340 417/500 [========================>.....] - ETA: 41s - loss: 0.4406 - regression_loss: 0.4065 - classification_loss: 0.0341 418/500 [========================>.....] - ETA: 41s - loss: 0.4417 - regression_loss: 0.4075 - classification_loss: 0.0343 419/500 [========================>.....] 
- ETA: 40s - loss: 0.4415 - regression_loss: 0.4073 - classification_loss: 0.0342 420/500 [========================>.....] - ETA: 40s - loss: 0.4419 - regression_loss: 0.4076 - classification_loss: 0.0343 421/500 [========================>.....] - ETA: 39s - loss: 0.4417 - regression_loss: 0.4074 - classification_loss: 0.0343 422/500 [========================>.....] - ETA: 39s - loss: 0.4410 - regression_loss: 0.4068 - classification_loss: 0.0342 423/500 [========================>.....] - ETA: 38s - loss: 0.4417 - regression_loss: 0.4075 - classification_loss: 0.0342 424/500 [========================>.....] - ETA: 38s - loss: 0.4421 - regression_loss: 0.4079 - classification_loss: 0.0342 425/500 [========================>.....] - ETA: 37s - loss: 0.4440 - regression_loss: 0.4096 - classification_loss: 0.0344 426/500 [========================>.....] - ETA: 37s - loss: 0.4444 - regression_loss: 0.4100 - classification_loss: 0.0344 427/500 [========================>.....] - ETA: 36s - loss: 0.4444 - regression_loss: 0.4101 - classification_loss: 0.0344 428/500 [========================>.....] - ETA: 36s - loss: 0.4450 - regression_loss: 0.4106 - classification_loss: 0.0344 429/500 [========================>.....] - ETA: 35s - loss: 0.4455 - regression_loss: 0.4110 - classification_loss: 0.0344 430/500 [========================>.....] - ETA: 35s - loss: 0.4454 - regression_loss: 0.4110 - classification_loss: 0.0344 431/500 [========================>.....] - ETA: 34s - loss: 0.4454 - regression_loss: 0.4110 - classification_loss: 0.0344 432/500 [========================>.....] - ETA: 34s - loss: 0.4453 - regression_loss: 0.4110 - classification_loss: 0.0344 433/500 [========================>.....] - ETA: 33s - loss: 0.4453 - regression_loss: 0.4109 - classification_loss: 0.0344 434/500 [=========================>....] - ETA: 33s - loss: 0.4450 - regression_loss: 0.4106 - classification_loss: 0.0343 435/500 [=========================>....] 
- ETA: 32s - loss: 0.4454 - regression_loss: 0.4110 - classification_loss: 0.0344
[per-batch progress for steps 436–499 of epoch 6 elided; loss held steady in the 0.441–0.446 range, classification_loss ≈ 0.034]
500/500 [==============================] - 251s 502ms/step - loss: 0.4428 - regression_loss: 0.4087 - classification_loss: 0.0341
1172 instances of class plum with average precision: 0.7127
mAP: 0.7127
Epoch 00006: saving model to ./training/snapshots/resnet101_pascal_06.h5
Epoch 7/150
[per-batch progress for steps 1–269 of epoch 7 elided; loss fluctuated between ~0.38 and ~0.47, settling near 0.418 with classification_loss ≈ 0.031]
270/500 [===============>..............] 
- ETA: 1:55 - loss: 0.4190 - regression_loss: 0.3882 - classification_loss: 0.0309 271/500 [===============>..............] - ETA: 1:54 - loss: 0.4196 - regression_loss: 0.3887 - classification_loss: 0.0309 272/500 [===============>..............] - ETA: 1:54 - loss: 0.4201 - regression_loss: 0.3892 - classification_loss: 0.0309 273/500 [===============>..............] - ETA: 1:53 - loss: 0.4213 - regression_loss: 0.3903 - classification_loss: 0.0310 274/500 [===============>..............] - ETA: 1:53 - loss: 0.4209 - regression_loss: 0.3899 - classification_loss: 0.0310 275/500 [===============>..............] - ETA: 1:52 - loss: 0.4210 - regression_loss: 0.3900 - classification_loss: 0.0310 276/500 [===============>..............] - ETA: 1:52 - loss: 0.4212 - regression_loss: 0.3902 - classification_loss: 0.0310 277/500 [===============>..............] - ETA: 1:51 - loss: 0.4211 - regression_loss: 0.3901 - classification_loss: 0.0309 278/500 [===============>..............] - ETA: 1:51 - loss: 0.4204 - regression_loss: 0.3895 - classification_loss: 0.0309 279/500 [===============>..............] - ETA: 1:50 - loss: 0.4198 - regression_loss: 0.3890 - classification_loss: 0.0308 280/500 [===============>..............] - ETA: 1:50 - loss: 0.4188 - regression_loss: 0.3881 - classification_loss: 0.0307 281/500 [===============>..............] - ETA: 1:49 - loss: 0.4198 - regression_loss: 0.3888 - classification_loss: 0.0310 282/500 [===============>..............] - ETA: 1:49 - loss: 0.4199 - regression_loss: 0.3888 - classification_loss: 0.0311 283/500 [===============>..............] - ETA: 1:48 - loss: 0.4202 - regression_loss: 0.3890 - classification_loss: 0.0312 284/500 [================>.............] - ETA: 1:48 - loss: 0.4200 - regression_loss: 0.3888 - classification_loss: 0.0312 285/500 [================>.............] - ETA: 1:47 - loss: 0.4210 - regression_loss: 0.3897 - classification_loss: 0.0313 286/500 [================>.............] 
- ETA: 1:47 - loss: 0.4208 - regression_loss: 0.3896 - classification_loss: 0.0312 287/500 [================>.............] - ETA: 1:46 - loss: 0.4202 - regression_loss: 0.3891 - classification_loss: 0.0311 288/500 [================>.............] - ETA: 1:46 - loss: 0.4205 - regression_loss: 0.3893 - classification_loss: 0.0312 289/500 [================>.............] - ETA: 1:45 - loss: 0.4210 - regression_loss: 0.3897 - classification_loss: 0.0313 290/500 [================>.............] - ETA: 1:45 - loss: 0.4212 - regression_loss: 0.3900 - classification_loss: 0.0313 291/500 [================>.............] - ETA: 1:44 - loss: 0.4212 - regression_loss: 0.3898 - classification_loss: 0.0314 292/500 [================>.............] - ETA: 1:44 - loss: 0.4218 - regression_loss: 0.3904 - classification_loss: 0.0314 293/500 [================>.............] - ETA: 1:43 - loss: 0.4212 - regression_loss: 0.3899 - classification_loss: 0.0313 294/500 [================>.............] - ETA: 1:43 - loss: 0.4225 - regression_loss: 0.3912 - classification_loss: 0.0313 295/500 [================>.............] - ETA: 1:42 - loss: 0.4226 - regression_loss: 0.3912 - classification_loss: 0.0314 296/500 [================>.............] - ETA: 1:42 - loss: 0.4227 - regression_loss: 0.3913 - classification_loss: 0.0314 297/500 [================>.............] - ETA: 1:41 - loss: 0.4240 - regression_loss: 0.3923 - classification_loss: 0.0316 298/500 [================>.............] - ETA: 1:41 - loss: 0.4236 - regression_loss: 0.3920 - classification_loss: 0.0316 299/500 [================>.............] - ETA: 1:40 - loss: 0.4246 - regression_loss: 0.3928 - classification_loss: 0.0318 300/500 [=================>............] - ETA: 1:40 - loss: 0.4245 - regression_loss: 0.3928 - classification_loss: 0.0317 301/500 [=================>............] - ETA: 1:39 - loss: 0.4241 - regression_loss: 0.3924 - classification_loss: 0.0317 302/500 [=================>............] 
- ETA: 1:39 - loss: 0.4239 - regression_loss: 0.3923 - classification_loss: 0.0316 303/500 [=================>............] - ETA: 1:38 - loss: 0.4236 - regression_loss: 0.3920 - classification_loss: 0.0316 304/500 [=================>............] - ETA: 1:38 - loss: 0.4230 - regression_loss: 0.3915 - classification_loss: 0.0315 305/500 [=================>............] - ETA: 1:37 - loss: 0.4222 - regression_loss: 0.3907 - classification_loss: 0.0315 306/500 [=================>............] - ETA: 1:37 - loss: 0.4219 - regression_loss: 0.3905 - classification_loss: 0.0314 307/500 [=================>............] - ETA: 1:36 - loss: 0.4235 - regression_loss: 0.3919 - classification_loss: 0.0316 308/500 [=================>............] - ETA: 1:36 - loss: 0.4239 - regression_loss: 0.3922 - classification_loss: 0.0317 309/500 [=================>............] - ETA: 1:35 - loss: 0.4242 - regression_loss: 0.3925 - classification_loss: 0.0318 310/500 [=================>............] - ETA: 1:35 - loss: 0.4243 - regression_loss: 0.3925 - classification_loss: 0.0318 311/500 [=================>............] - ETA: 1:34 - loss: 0.4246 - regression_loss: 0.3928 - classification_loss: 0.0318 312/500 [=================>............] - ETA: 1:34 - loss: 0.4242 - regression_loss: 0.3925 - classification_loss: 0.0317 313/500 [=================>............] - ETA: 1:33 - loss: 0.4242 - regression_loss: 0.3925 - classification_loss: 0.0317 314/500 [=================>............] - ETA: 1:33 - loss: 0.4246 - regression_loss: 0.3928 - classification_loss: 0.0318 315/500 [=================>............] - ETA: 1:32 - loss: 0.4247 - regression_loss: 0.3929 - classification_loss: 0.0318 316/500 [=================>............] - ETA: 1:32 - loss: 0.4253 - regression_loss: 0.3935 - classification_loss: 0.0318 317/500 [==================>...........] - ETA: 1:31 - loss: 0.4256 - regression_loss: 0.3937 - classification_loss: 0.0319 318/500 [==================>...........] 
- ETA: 1:31 - loss: 0.4251 - regression_loss: 0.3932 - classification_loss: 0.0319 319/500 [==================>...........] - ETA: 1:30 - loss: 0.4261 - regression_loss: 0.3940 - classification_loss: 0.0320 320/500 [==================>...........] - ETA: 1:30 - loss: 0.4259 - regression_loss: 0.3939 - classification_loss: 0.0320 321/500 [==================>...........] - ETA: 1:29 - loss: 0.4256 - regression_loss: 0.3936 - classification_loss: 0.0320 322/500 [==================>...........] - ETA: 1:29 - loss: 0.4259 - regression_loss: 0.3938 - classification_loss: 0.0321 323/500 [==================>...........] - ETA: 1:28 - loss: 0.4261 - regression_loss: 0.3938 - classification_loss: 0.0322 324/500 [==================>...........] - ETA: 1:28 - loss: 0.4262 - regression_loss: 0.3940 - classification_loss: 0.0322 325/500 [==================>...........] - ETA: 1:27 - loss: 0.4264 - regression_loss: 0.3941 - classification_loss: 0.0322 326/500 [==================>...........] - ETA: 1:27 - loss: 0.4260 - regression_loss: 0.3939 - classification_loss: 0.0322 327/500 [==================>...........] - ETA: 1:26 - loss: 0.4256 - regression_loss: 0.3935 - classification_loss: 0.0321 328/500 [==================>...........] - ETA: 1:26 - loss: 0.4252 - regression_loss: 0.3931 - classification_loss: 0.0321 329/500 [==================>...........] - ETA: 1:25 - loss: 0.4257 - regression_loss: 0.3935 - classification_loss: 0.0322 330/500 [==================>...........] - ETA: 1:25 - loss: 0.4254 - regression_loss: 0.3932 - classification_loss: 0.0322 331/500 [==================>...........] - ETA: 1:24 - loss: 0.4250 - regression_loss: 0.3928 - classification_loss: 0.0322 332/500 [==================>...........] - ETA: 1:24 - loss: 0.4248 - regression_loss: 0.3927 - classification_loss: 0.0321 333/500 [==================>...........] - ETA: 1:23 - loss: 0.4249 - regression_loss: 0.3928 - classification_loss: 0.0321 334/500 [===================>..........] 
- ETA: 1:23 - loss: 0.4252 - regression_loss: 0.3930 - classification_loss: 0.0321 335/500 [===================>..........] - ETA: 1:22 - loss: 0.4251 - regression_loss: 0.3930 - classification_loss: 0.0321 336/500 [===================>..........] - ETA: 1:22 - loss: 0.4256 - regression_loss: 0.3936 - classification_loss: 0.0321 337/500 [===================>..........] - ETA: 1:21 - loss: 0.4250 - regression_loss: 0.3930 - classification_loss: 0.0320 338/500 [===================>..........] - ETA: 1:21 - loss: 0.4245 - regression_loss: 0.3925 - classification_loss: 0.0320 339/500 [===================>..........] - ETA: 1:20 - loss: 0.4249 - regression_loss: 0.3929 - classification_loss: 0.0320 340/500 [===================>..........] - ETA: 1:20 - loss: 0.4241 - regression_loss: 0.3921 - classification_loss: 0.0319 341/500 [===================>..........] - ETA: 1:19 - loss: 0.4245 - regression_loss: 0.3926 - classification_loss: 0.0320 342/500 [===================>..........] - ETA: 1:19 - loss: 0.4245 - regression_loss: 0.3925 - classification_loss: 0.0320 343/500 [===================>..........] - ETA: 1:18 - loss: 0.4255 - regression_loss: 0.3932 - classification_loss: 0.0323 344/500 [===================>..........] - ETA: 1:18 - loss: 0.4251 - regression_loss: 0.3928 - classification_loss: 0.0322 345/500 [===================>..........] - ETA: 1:17 - loss: 0.4246 - regression_loss: 0.3924 - classification_loss: 0.0322 346/500 [===================>..........] - ETA: 1:17 - loss: 0.4247 - regression_loss: 0.3926 - classification_loss: 0.0322 347/500 [===================>..........] - ETA: 1:16 - loss: 0.4246 - regression_loss: 0.3924 - classification_loss: 0.0321 348/500 [===================>..........] - ETA: 1:16 - loss: 0.4248 - regression_loss: 0.3927 - classification_loss: 0.0322 349/500 [===================>..........] - ETA: 1:15 - loss: 0.4246 - regression_loss: 0.3925 - classification_loss: 0.0321 350/500 [====================>.........] 
- ETA: 1:15 - loss: 0.4243 - regression_loss: 0.3922 - classification_loss: 0.0321 351/500 [====================>.........] - ETA: 1:14 - loss: 0.4248 - regression_loss: 0.3927 - classification_loss: 0.0321 352/500 [====================>.........] - ETA: 1:14 - loss: 0.4252 - regression_loss: 0.3931 - classification_loss: 0.0322 353/500 [====================>.........] - ETA: 1:13 - loss: 0.4260 - regression_loss: 0.3936 - classification_loss: 0.0324 354/500 [====================>.........] - ETA: 1:13 - loss: 0.4262 - regression_loss: 0.3937 - classification_loss: 0.0324 355/500 [====================>.........] - ETA: 1:12 - loss: 0.4255 - regression_loss: 0.3931 - classification_loss: 0.0324 356/500 [====================>.........] - ETA: 1:12 - loss: 0.4261 - regression_loss: 0.3937 - classification_loss: 0.0324 357/500 [====================>.........] - ETA: 1:11 - loss: 0.4255 - regression_loss: 0.3931 - classification_loss: 0.0323 358/500 [====================>.........] - ETA: 1:11 - loss: 0.4257 - regression_loss: 0.3934 - classification_loss: 0.0323 359/500 [====================>.........] - ETA: 1:10 - loss: 0.4261 - regression_loss: 0.3938 - classification_loss: 0.0323 360/500 [====================>.........] - ETA: 1:10 - loss: 0.4261 - regression_loss: 0.3937 - classification_loss: 0.0323 361/500 [====================>.........] - ETA: 1:09 - loss: 0.4263 - regression_loss: 0.3940 - classification_loss: 0.0323 362/500 [====================>.........] - ETA: 1:09 - loss: 0.4263 - regression_loss: 0.3939 - classification_loss: 0.0324 363/500 [====================>.........] - ETA: 1:08 - loss: 0.4258 - regression_loss: 0.3934 - classification_loss: 0.0323 364/500 [====================>.........] - ETA: 1:08 - loss: 0.4263 - regression_loss: 0.3939 - classification_loss: 0.0324 365/500 [====================>.........] - ETA: 1:07 - loss: 0.4272 - regression_loss: 0.3948 - classification_loss: 0.0324 366/500 [====================>.........] 
- ETA: 1:07 - loss: 0.4281 - regression_loss: 0.3957 - classification_loss: 0.0324 367/500 [=====================>........] - ETA: 1:06 - loss: 0.4274 - regression_loss: 0.3951 - classification_loss: 0.0323 368/500 [=====================>........] - ETA: 1:06 - loss: 0.4274 - regression_loss: 0.3951 - classification_loss: 0.0323 369/500 [=====================>........] - ETA: 1:05 - loss: 0.4266 - regression_loss: 0.3944 - classification_loss: 0.0323 370/500 [=====================>........] - ETA: 1:05 - loss: 0.4258 - regression_loss: 0.3936 - classification_loss: 0.0322 371/500 [=====================>........] - ETA: 1:04 - loss: 0.4258 - regression_loss: 0.3936 - classification_loss: 0.0322 372/500 [=====================>........] - ETA: 1:04 - loss: 0.4258 - regression_loss: 0.3936 - classification_loss: 0.0322 373/500 [=====================>........] - ETA: 1:03 - loss: 0.4250 - regression_loss: 0.3929 - classification_loss: 0.0321 374/500 [=====================>........] - ETA: 1:03 - loss: 0.4247 - regression_loss: 0.3926 - classification_loss: 0.0321 375/500 [=====================>........] - ETA: 1:02 - loss: 0.4251 - regression_loss: 0.3929 - classification_loss: 0.0322 376/500 [=====================>........] - ETA: 1:02 - loss: 0.4255 - regression_loss: 0.3932 - classification_loss: 0.0322 377/500 [=====================>........] - ETA: 1:01 - loss: 0.4251 - regression_loss: 0.3929 - classification_loss: 0.0322 378/500 [=====================>........] - ETA: 1:01 - loss: 0.4253 - regression_loss: 0.3930 - classification_loss: 0.0322 379/500 [=====================>........] - ETA: 1:00 - loss: 0.4253 - regression_loss: 0.3931 - classification_loss: 0.0322 380/500 [=====================>........] - ETA: 1:00 - loss: 0.4250 - regression_loss: 0.3928 - classification_loss: 0.0322 381/500 [=====================>........] - ETA: 59s - loss: 0.4247 - regression_loss: 0.3925 - classification_loss: 0.0322  382/500 [=====================>........] 
- ETA: 59s - loss: 0.4248 - regression_loss: 0.3927 - classification_loss: 0.0321 383/500 [=====================>........] - ETA: 58s - loss: 0.4244 - regression_loss: 0.3924 - classification_loss: 0.0321 384/500 [======================>.......] - ETA: 58s - loss: 0.4248 - regression_loss: 0.3927 - classification_loss: 0.0321 385/500 [======================>.......] - ETA: 57s - loss: 0.4248 - regression_loss: 0.3927 - classification_loss: 0.0321 386/500 [======================>.......] - ETA: 57s - loss: 0.4246 - regression_loss: 0.3925 - classification_loss: 0.0321 387/500 [======================>.......] - ETA: 56s - loss: 0.4249 - regression_loss: 0.3928 - classification_loss: 0.0321 388/500 [======================>.......] - ETA: 56s - loss: 0.4243 - regression_loss: 0.3923 - classification_loss: 0.0320 389/500 [======================>.......] - ETA: 55s - loss: 0.4246 - regression_loss: 0.3925 - classification_loss: 0.0321 390/500 [======================>.......] - ETA: 55s - loss: 0.4241 - regression_loss: 0.3921 - classification_loss: 0.0321 391/500 [======================>.......] - ETA: 54s - loss: 0.4238 - regression_loss: 0.3918 - classification_loss: 0.0320 392/500 [======================>.......] - ETA: 54s - loss: 0.4241 - regression_loss: 0.3921 - classification_loss: 0.0321 393/500 [======================>.......] - ETA: 53s - loss: 0.4249 - regression_loss: 0.3928 - classification_loss: 0.0321 394/500 [======================>.......] - ETA: 53s - loss: 0.4250 - regression_loss: 0.3929 - classification_loss: 0.0321 395/500 [======================>.......] - ETA: 52s - loss: 0.4251 - regression_loss: 0.3930 - classification_loss: 0.0321 396/500 [======================>.......] - ETA: 52s - loss: 0.4250 - regression_loss: 0.3929 - classification_loss: 0.0321 397/500 [======================>.......] - ETA: 51s - loss: 0.4250 - regression_loss: 0.3929 - classification_loss: 0.0321 398/500 [======================>.......] 
- ETA: 51s - loss: 0.4249 - regression_loss: 0.3928 - classification_loss: 0.0321 399/500 [======================>.......] - ETA: 50s - loss: 0.4249 - regression_loss: 0.3928 - classification_loss: 0.0321 400/500 [=======================>......] - ETA: 50s - loss: 0.4247 - regression_loss: 0.3926 - classification_loss: 0.0321 401/500 [=======================>......] - ETA: 49s - loss: 0.4243 - regression_loss: 0.3922 - classification_loss: 0.0321 402/500 [=======================>......] - ETA: 49s - loss: 0.4237 - regression_loss: 0.3917 - classification_loss: 0.0320 403/500 [=======================>......] - ETA: 48s - loss: 0.4241 - regression_loss: 0.3919 - classification_loss: 0.0322 404/500 [=======================>......] - ETA: 48s - loss: 0.4240 - regression_loss: 0.3918 - classification_loss: 0.0321 405/500 [=======================>......] - ETA: 47s - loss: 0.4249 - regression_loss: 0.3926 - classification_loss: 0.0323 406/500 [=======================>......] - ETA: 47s - loss: 0.4246 - regression_loss: 0.3923 - classification_loss: 0.0323 407/500 [=======================>......] - ETA: 46s - loss: 0.4243 - regression_loss: 0.3920 - classification_loss: 0.0323 408/500 [=======================>......] - ETA: 46s - loss: 0.4236 - regression_loss: 0.3914 - classification_loss: 0.0322 409/500 [=======================>......] - ETA: 45s - loss: 0.4234 - regression_loss: 0.3912 - classification_loss: 0.0322 410/500 [=======================>......] - ETA: 45s - loss: 0.4236 - regression_loss: 0.3914 - classification_loss: 0.0321 411/500 [=======================>......] - ETA: 44s - loss: 0.4233 - regression_loss: 0.3912 - classification_loss: 0.0321 412/500 [=======================>......] - ETA: 44s - loss: 0.4230 - regression_loss: 0.3909 - classification_loss: 0.0321 413/500 [=======================>......] - ETA: 43s - loss: 0.4230 - regression_loss: 0.3909 - classification_loss: 0.0320 414/500 [=======================>......] 
- ETA: 43s - loss: 0.4232 - regression_loss: 0.3911 - classification_loss: 0.0321 415/500 [=======================>......] - ETA: 42s - loss: 0.4233 - regression_loss: 0.3912 - classification_loss: 0.0321 416/500 [=======================>......] - ETA: 42s - loss: 0.4236 - regression_loss: 0.3914 - classification_loss: 0.0321 417/500 [========================>.....] - ETA: 41s - loss: 0.4233 - regression_loss: 0.3911 - classification_loss: 0.0321 418/500 [========================>.....] - ETA: 41s - loss: 0.4234 - regression_loss: 0.3913 - classification_loss: 0.0321 419/500 [========================>.....] - ETA: 40s - loss: 0.4231 - regression_loss: 0.3910 - classification_loss: 0.0321 420/500 [========================>.....] - ETA: 40s - loss: 0.4227 - regression_loss: 0.3907 - classification_loss: 0.0320 421/500 [========================>.....] - ETA: 39s - loss: 0.4224 - regression_loss: 0.3904 - classification_loss: 0.0320 422/500 [========================>.....] - ETA: 39s - loss: 0.4225 - regression_loss: 0.3905 - classification_loss: 0.0320 423/500 [========================>.....] - ETA: 38s - loss: 0.4224 - regression_loss: 0.3904 - classification_loss: 0.0320 424/500 [========================>.....] - ETA: 38s - loss: 0.4220 - regression_loss: 0.3900 - classification_loss: 0.0320 425/500 [========================>.....] - ETA: 37s - loss: 0.4220 - regression_loss: 0.3901 - classification_loss: 0.0319 426/500 [========================>.....] - ETA: 37s - loss: 0.4222 - regression_loss: 0.3903 - classification_loss: 0.0319 427/500 [========================>.....] - ETA: 36s - loss: 0.4219 - regression_loss: 0.3900 - classification_loss: 0.0319 428/500 [========================>.....] - ETA: 36s - loss: 0.4222 - regression_loss: 0.3903 - classification_loss: 0.0319 429/500 [========================>.....] - ETA: 35s - loss: 0.4218 - regression_loss: 0.3900 - classification_loss: 0.0318 430/500 [========================>.....] 
- ETA: 35s - loss: 0.4220 - regression_loss: 0.3902 - classification_loss: 0.0318 431/500 [========================>.....] - ETA: 34s - loss: 0.4226 - regression_loss: 0.3907 - classification_loss: 0.0319 432/500 [========================>.....] - ETA: 34s - loss: 0.4228 - regression_loss: 0.3910 - classification_loss: 0.0319 433/500 [========================>.....] - ETA: 33s - loss: 0.4223 - regression_loss: 0.3905 - classification_loss: 0.0318 434/500 [=========================>....] - ETA: 33s - loss: 0.4227 - regression_loss: 0.3909 - classification_loss: 0.0319 435/500 [=========================>....] - ETA: 32s - loss: 0.4222 - regression_loss: 0.3904 - classification_loss: 0.0318 436/500 [=========================>....] - ETA: 32s - loss: 0.4220 - regression_loss: 0.3902 - classification_loss: 0.0318 437/500 [=========================>....] - ETA: 31s - loss: 0.4222 - regression_loss: 0.3904 - classification_loss: 0.0318 438/500 [=========================>....] - ETA: 31s - loss: 0.4222 - regression_loss: 0.3904 - classification_loss: 0.0318 439/500 [=========================>....] - ETA: 30s - loss: 0.4220 - regression_loss: 0.3902 - classification_loss: 0.0318 440/500 [=========================>....] - ETA: 30s - loss: 0.4223 - regression_loss: 0.3904 - classification_loss: 0.0319 441/500 [=========================>....] - ETA: 29s - loss: 0.4219 - regression_loss: 0.3901 - classification_loss: 0.0318 442/500 [=========================>....] - ETA: 29s - loss: 0.4220 - regression_loss: 0.3902 - classification_loss: 0.0318 443/500 [=========================>....] - ETA: 28s - loss: 0.4220 - regression_loss: 0.3902 - classification_loss: 0.0318 444/500 [=========================>....] - ETA: 28s - loss: 0.4223 - regression_loss: 0.3906 - classification_loss: 0.0317 445/500 [=========================>....] - ETA: 27s - loss: 0.4228 - regression_loss: 0.3910 - classification_loss: 0.0318 446/500 [=========================>....] 
- ETA: 27s - loss: 0.4232 - regression_loss: 0.3914 - classification_loss: 0.0318 447/500 [=========================>....] - ETA: 26s - loss: 0.4232 - regression_loss: 0.3914 - classification_loss: 0.0318 448/500 [=========================>....] - ETA: 26s - loss: 0.4237 - regression_loss: 0.3918 - classification_loss: 0.0319 449/500 [=========================>....] - ETA: 25s - loss: 0.4236 - regression_loss: 0.3917 - classification_loss: 0.0319 450/500 [==========================>...] - ETA: 25s - loss: 0.4234 - regression_loss: 0.3915 - classification_loss: 0.0319 451/500 [==========================>...] - ETA: 24s - loss: 0.4232 - regression_loss: 0.3913 - classification_loss: 0.0319 452/500 [==========================>...] - ETA: 24s - loss: 0.4227 - regression_loss: 0.3909 - classification_loss: 0.0318 453/500 [==========================>...] - ETA: 23s - loss: 0.4223 - regression_loss: 0.3906 - classification_loss: 0.0318 454/500 [==========================>...] - ETA: 23s - loss: 0.4225 - regression_loss: 0.3908 - classification_loss: 0.0318 455/500 [==========================>...] - ETA: 22s - loss: 0.4221 - regression_loss: 0.3904 - classification_loss: 0.0317 456/500 [==========================>...] - ETA: 22s - loss: 0.4217 - regression_loss: 0.3900 - classification_loss: 0.0317 457/500 [==========================>...] - ETA: 21s - loss: 0.4211 - regression_loss: 0.3895 - classification_loss: 0.0316 458/500 [==========================>...] - ETA: 21s - loss: 0.4215 - regression_loss: 0.3899 - classification_loss: 0.0316 459/500 [==========================>...] - ETA: 20s - loss: 0.4226 - regression_loss: 0.3908 - classification_loss: 0.0317 460/500 [==========================>...] - ETA: 20s - loss: 0.4229 - regression_loss: 0.3911 - classification_loss: 0.0318 461/500 [==========================>...] - ETA: 19s - loss: 0.4227 - regression_loss: 0.3910 - classification_loss: 0.0317 462/500 [==========================>...] 
- ETA: 19s - loss: 0.4222 - regression_loss: 0.3906 - classification_loss: 0.0317 463/500 [==========================>...] - ETA: 18s - loss: 0.4223 - regression_loss: 0.3907 - classification_loss: 0.0317 464/500 [==========================>...] - ETA: 18s - loss: 0.4229 - regression_loss: 0.3912 - classification_loss: 0.0317 465/500 [==========================>...] - ETA: 17s - loss: 0.4232 - regression_loss: 0.3916 - classification_loss: 0.0316 466/500 [==========================>...] - ETA: 17s - loss: 0.4237 - regression_loss: 0.3920 - classification_loss: 0.0316 467/500 [===========================>..] - ETA: 16s - loss: 0.4233 - regression_loss: 0.3917 - classification_loss: 0.0316 468/500 [===========================>..] - ETA: 16s - loss: 0.4230 - regression_loss: 0.3915 - classification_loss: 0.0316 469/500 [===========================>..] - ETA: 15s - loss: 0.4234 - regression_loss: 0.3918 - classification_loss: 0.0316 470/500 [===========================>..] - ETA: 15s - loss: 0.4241 - regression_loss: 0.3924 - classification_loss: 0.0317 471/500 [===========================>..] - ETA: 14s - loss: 0.4237 - regression_loss: 0.3920 - classification_loss: 0.0317 472/500 [===========================>..] - ETA: 14s - loss: 0.4240 - regression_loss: 0.3923 - classification_loss: 0.0317 473/500 [===========================>..] - ETA: 13s - loss: 0.4240 - regression_loss: 0.3922 - classification_loss: 0.0318 474/500 [===========================>..] - ETA: 13s - loss: 0.4238 - regression_loss: 0.3921 - classification_loss: 0.0317 475/500 [===========================>..] - ETA: 12s - loss: 0.4240 - regression_loss: 0.3922 - classification_loss: 0.0317 476/500 [===========================>..] - ETA: 12s - loss: 0.4245 - regression_loss: 0.3927 - classification_loss: 0.0318 477/500 [===========================>..] - ETA: 11s - loss: 0.4250 - regression_loss: 0.3932 - classification_loss: 0.0319 478/500 [===========================>..] 
[Epoch 7/150: per-batch progress output condensed; running losses over batches 478-499 held near loss: 0.427 - regression_loss: 0.395 - classification_loss: 0.032]
500/500 [==============================] - 251s 502ms/step - loss: 0.4264 - regression_loss: 0.3942 - classification_loss: 0.0322
1172 instances of class plum with average precision: 0.7250
mAP: 0.7250
Epoch 00007: saving model to ./training/snapshots/resnet101_pascal_07.h5
Epoch 8/150
[Epoch 8/150: per-batch progress output condensed; from batch 1/500 (ETA ~4:20) through batch 313/500 (ETA ~1:34) the running losses settled around loss: 0.44 - regression_loss: 0.41 - classification_loss: 0.032; log continues mid-epoch]
- ETA: 1:34 - loss: 0.4439 - regression_loss: 0.4114 - classification_loss: 0.0325 314/500 [=================>............] - ETA: 1:33 - loss: 0.4454 - regression_loss: 0.4128 - classification_loss: 0.0326 315/500 [=================>............] - ETA: 1:33 - loss: 0.4455 - regression_loss: 0.4129 - classification_loss: 0.0326 316/500 [=================>............] - ETA: 1:32 - loss: 0.4458 - regression_loss: 0.4131 - classification_loss: 0.0327 317/500 [==================>...........] - ETA: 1:32 - loss: 0.4456 - regression_loss: 0.4130 - classification_loss: 0.0326 318/500 [==================>...........] - ETA: 1:31 - loss: 0.4454 - regression_loss: 0.4128 - classification_loss: 0.0326 319/500 [==================>...........] - ETA: 1:31 - loss: 0.4452 - regression_loss: 0.4125 - classification_loss: 0.0327 320/500 [==================>...........] - ETA: 1:30 - loss: 0.4455 - regression_loss: 0.4129 - classification_loss: 0.0326 321/500 [==================>...........] - ETA: 1:30 - loss: 0.4458 - regression_loss: 0.4132 - classification_loss: 0.0326 322/500 [==================>...........] - ETA: 1:29 - loss: 0.4461 - regression_loss: 0.4135 - classification_loss: 0.0326 323/500 [==================>...........] - ETA: 1:29 - loss: 0.4465 - regression_loss: 0.4138 - classification_loss: 0.0327 324/500 [==================>...........] - ETA: 1:28 - loss: 0.4471 - regression_loss: 0.4144 - classification_loss: 0.0326 325/500 [==================>...........] - ETA: 1:28 - loss: 0.4465 - regression_loss: 0.4139 - classification_loss: 0.0326 326/500 [==================>...........] - ETA: 1:27 - loss: 0.4471 - regression_loss: 0.4145 - classification_loss: 0.0326 327/500 [==================>...........] - ETA: 1:27 - loss: 0.4477 - regression_loss: 0.4151 - classification_loss: 0.0326 328/500 [==================>...........] - ETA: 1:26 - loss: 0.4480 - regression_loss: 0.4154 - classification_loss: 0.0326 329/500 [==================>...........] 
- ETA: 1:26 - loss: 0.4470 - regression_loss: 0.4145 - classification_loss: 0.0326 330/500 [==================>...........] - ETA: 1:25 - loss: 0.4472 - regression_loss: 0.4146 - classification_loss: 0.0326 331/500 [==================>...........] - ETA: 1:25 - loss: 0.4476 - regression_loss: 0.4150 - classification_loss: 0.0325 332/500 [==================>...........] - ETA: 1:24 - loss: 0.4479 - regression_loss: 0.4153 - classification_loss: 0.0326 333/500 [==================>...........] - ETA: 1:24 - loss: 0.4479 - regression_loss: 0.4154 - classification_loss: 0.0325 334/500 [===================>..........] - ETA: 1:23 - loss: 0.4480 - regression_loss: 0.4154 - classification_loss: 0.0326 335/500 [===================>..........] - ETA: 1:23 - loss: 0.4483 - regression_loss: 0.4158 - classification_loss: 0.0326 336/500 [===================>..........] - ETA: 1:22 - loss: 0.4485 - regression_loss: 0.4160 - classification_loss: 0.0325 337/500 [===================>..........] - ETA: 1:22 - loss: 0.4493 - regression_loss: 0.4168 - classification_loss: 0.0326 338/500 [===================>..........] - ETA: 1:21 - loss: 0.4509 - regression_loss: 0.4183 - classification_loss: 0.0326 339/500 [===================>..........] - ETA: 1:21 - loss: 0.4512 - regression_loss: 0.4186 - classification_loss: 0.0326 340/500 [===================>..........] - ETA: 1:20 - loss: 0.4510 - regression_loss: 0.4184 - classification_loss: 0.0326 341/500 [===================>..........] - ETA: 1:20 - loss: 0.4512 - regression_loss: 0.4186 - classification_loss: 0.0326 342/500 [===================>..........] - ETA: 1:19 - loss: 0.4512 - regression_loss: 0.4185 - classification_loss: 0.0326 343/500 [===================>..........] - ETA: 1:18 - loss: 0.4513 - regression_loss: 0.4187 - classification_loss: 0.0326 344/500 [===================>..........] - ETA: 1:18 - loss: 0.4512 - regression_loss: 0.4186 - classification_loss: 0.0326 345/500 [===================>..........] 
- ETA: 1:17 - loss: 0.4509 - regression_loss: 0.4184 - classification_loss: 0.0325 346/500 [===================>..........] - ETA: 1:17 - loss: 0.4507 - regression_loss: 0.4182 - classification_loss: 0.0325 347/500 [===================>..........] - ETA: 1:16 - loss: 0.4505 - regression_loss: 0.4181 - classification_loss: 0.0324 348/500 [===================>..........] - ETA: 1:16 - loss: 0.4504 - regression_loss: 0.4180 - classification_loss: 0.0324 349/500 [===================>..........] - ETA: 1:15 - loss: 0.4504 - regression_loss: 0.4180 - classification_loss: 0.0324 350/500 [====================>.........] - ETA: 1:15 - loss: 0.4505 - regression_loss: 0.4180 - classification_loss: 0.0325 351/500 [====================>.........] - ETA: 1:14 - loss: 0.4508 - regression_loss: 0.4183 - classification_loss: 0.0325 352/500 [====================>.........] - ETA: 1:14 - loss: 0.4501 - regression_loss: 0.4176 - classification_loss: 0.0325 353/500 [====================>.........] - ETA: 1:13 - loss: 0.4496 - regression_loss: 0.4172 - classification_loss: 0.0324 354/500 [====================>.........] - ETA: 1:13 - loss: 0.4500 - regression_loss: 0.4175 - classification_loss: 0.0325 355/500 [====================>.........] - ETA: 1:12 - loss: 0.4502 - regression_loss: 0.4176 - classification_loss: 0.0326 356/500 [====================>.........] - ETA: 1:12 - loss: 0.4501 - regression_loss: 0.4175 - classification_loss: 0.0326 357/500 [====================>.........] - ETA: 1:11 - loss: 0.4497 - regression_loss: 0.4172 - classification_loss: 0.0325 358/500 [====================>.........] - ETA: 1:11 - loss: 0.4502 - regression_loss: 0.4175 - classification_loss: 0.0327 359/500 [====================>.........] - ETA: 1:10 - loss: 0.4500 - regression_loss: 0.4174 - classification_loss: 0.0326 360/500 [====================>.........] - ETA: 1:10 - loss: 0.4496 - regression_loss: 0.4170 - classification_loss: 0.0326 361/500 [====================>.........] 
- ETA: 1:09 - loss: 0.4492 - regression_loss: 0.4166 - classification_loss: 0.0326 362/500 [====================>.........] - ETA: 1:09 - loss: 0.4498 - regression_loss: 0.4172 - classification_loss: 0.0325 363/500 [====================>.........] - ETA: 1:08 - loss: 0.4501 - regression_loss: 0.4174 - classification_loss: 0.0326 364/500 [====================>.........] - ETA: 1:08 - loss: 0.4499 - regression_loss: 0.4173 - classification_loss: 0.0326 365/500 [====================>.........] - ETA: 1:07 - loss: 0.4498 - regression_loss: 0.4172 - classification_loss: 0.0326 366/500 [====================>.........] - ETA: 1:07 - loss: 0.4498 - regression_loss: 0.4172 - classification_loss: 0.0325 367/500 [=====================>........] - ETA: 1:06 - loss: 0.4492 - regression_loss: 0.4167 - classification_loss: 0.0325 368/500 [=====================>........] - ETA: 1:06 - loss: 0.4491 - regression_loss: 0.4166 - classification_loss: 0.0324 369/500 [=====================>........] - ETA: 1:05 - loss: 0.4488 - regression_loss: 0.4164 - classification_loss: 0.0324 370/500 [=====================>........] - ETA: 1:05 - loss: 0.4488 - regression_loss: 0.4165 - classification_loss: 0.0323 371/500 [=====================>........] - ETA: 1:04 - loss: 0.4486 - regression_loss: 0.4163 - classification_loss: 0.0323 372/500 [=====================>........] - ETA: 1:04 - loss: 0.4478 - regression_loss: 0.4156 - classification_loss: 0.0322 373/500 [=====================>........] - ETA: 1:03 - loss: 0.4480 - regression_loss: 0.4157 - classification_loss: 0.0322 374/500 [=====================>........] - ETA: 1:03 - loss: 0.4473 - regression_loss: 0.4151 - classification_loss: 0.0322 375/500 [=====================>........] - ETA: 1:02 - loss: 0.4472 - regression_loss: 0.4150 - classification_loss: 0.0321 376/500 [=====================>........] - ETA: 1:02 - loss: 0.4468 - regression_loss: 0.4147 - classification_loss: 0.0321 377/500 [=====================>........] 
- ETA: 1:01 - loss: 0.4472 - regression_loss: 0.4150 - classification_loss: 0.0322 378/500 [=====================>........] - ETA: 1:01 - loss: 0.4467 - regression_loss: 0.4146 - classification_loss: 0.0322 379/500 [=====================>........] - ETA: 1:00 - loss: 0.4459 - regression_loss: 0.4138 - classification_loss: 0.0321 380/500 [=====================>........] - ETA: 1:00 - loss: 0.4457 - regression_loss: 0.4136 - classification_loss: 0.0321 381/500 [=====================>........] - ETA: 59s - loss: 0.4455 - regression_loss: 0.4135 - classification_loss: 0.0321  382/500 [=====================>........] - ETA: 59s - loss: 0.4449 - regression_loss: 0.4128 - classification_loss: 0.0320 383/500 [=====================>........] - ETA: 58s - loss: 0.4450 - regression_loss: 0.4130 - classification_loss: 0.0320 384/500 [======================>.......] - ETA: 58s - loss: 0.4448 - regression_loss: 0.4128 - classification_loss: 0.0320 385/500 [======================>.......] - ETA: 57s - loss: 0.4446 - regression_loss: 0.4126 - classification_loss: 0.0320 386/500 [======================>.......] - ETA: 57s - loss: 0.4441 - regression_loss: 0.4122 - classification_loss: 0.0319 387/500 [======================>.......] - ETA: 56s - loss: 0.4439 - regression_loss: 0.4120 - classification_loss: 0.0319 388/500 [======================>.......] - ETA: 56s - loss: 0.4438 - regression_loss: 0.4119 - classification_loss: 0.0319 389/500 [======================>.......] - ETA: 55s - loss: 0.4440 - regression_loss: 0.4121 - classification_loss: 0.0319 390/500 [======================>.......] - ETA: 55s - loss: 0.4433 - regression_loss: 0.4114 - classification_loss: 0.0318 391/500 [======================>.......] - ETA: 54s - loss: 0.4436 - regression_loss: 0.4118 - classification_loss: 0.0318 392/500 [======================>.......] - ETA: 54s - loss: 0.4439 - regression_loss: 0.4121 - classification_loss: 0.0318 393/500 [======================>.......] 
- ETA: 53s - loss: 0.4435 - regression_loss: 0.4118 - classification_loss: 0.0317 394/500 [======================>.......] - ETA: 53s - loss: 0.4438 - regression_loss: 0.4120 - classification_loss: 0.0319 395/500 [======================>.......] - ETA: 52s - loss: 0.4433 - regression_loss: 0.4115 - classification_loss: 0.0318 396/500 [======================>.......] - ETA: 52s - loss: 0.4428 - regression_loss: 0.4110 - classification_loss: 0.0318 397/500 [======================>.......] - ETA: 51s - loss: 0.4422 - regression_loss: 0.4104 - classification_loss: 0.0317 398/500 [======================>.......] - ETA: 51s - loss: 0.4425 - regression_loss: 0.4108 - classification_loss: 0.0318 399/500 [======================>.......] - ETA: 50s - loss: 0.4427 - regression_loss: 0.4109 - classification_loss: 0.0318 400/500 [=======================>......] - ETA: 50s - loss: 0.4425 - regression_loss: 0.4107 - classification_loss: 0.0318 401/500 [=======================>......] - ETA: 49s - loss: 0.4427 - regression_loss: 0.4109 - classification_loss: 0.0318 402/500 [=======================>......] - ETA: 49s - loss: 0.4434 - regression_loss: 0.4115 - classification_loss: 0.0319 403/500 [=======================>......] - ETA: 48s - loss: 0.4434 - regression_loss: 0.4114 - classification_loss: 0.0319 404/500 [=======================>......] - ETA: 48s - loss: 0.4437 - regression_loss: 0.4118 - classification_loss: 0.0320 405/500 [=======================>......] - ETA: 47s - loss: 0.4430 - regression_loss: 0.4111 - classification_loss: 0.0319 406/500 [=======================>......] - ETA: 47s - loss: 0.4427 - regression_loss: 0.4108 - classification_loss: 0.0319 407/500 [=======================>......] - ETA: 46s - loss: 0.4431 - regression_loss: 0.4111 - classification_loss: 0.0319 408/500 [=======================>......] - ETA: 46s - loss: 0.4432 - regression_loss: 0.4112 - classification_loss: 0.0320 409/500 [=======================>......] 
- ETA: 45s - loss: 0.4433 - regression_loss: 0.4112 - classification_loss: 0.0320 410/500 [=======================>......] - ETA: 45s - loss: 0.4431 - regression_loss: 0.4111 - classification_loss: 0.0320 411/500 [=======================>......] - ETA: 44s - loss: 0.4429 - regression_loss: 0.4109 - classification_loss: 0.0320 412/500 [=======================>......] - ETA: 44s - loss: 0.4429 - regression_loss: 0.4109 - classification_loss: 0.0320 413/500 [=======================>......] - ETA: 43s - loss: 0.4430 - regression_loss: 0.4110 - classification_loss: 0.0320 414/500 [=======================>......] - ETA: 43s - loss: 0.4426 - regression_loss: 0.4106 - classification_loss: 0.0319 415/500 [=======================>......] - ETA: 42s - loss: 0.4425 - regression_loss: 0.4106 - classification_loss: 0.0319 416/500 [=======================>......] - ETA: 42s - loss: 0.4423 - regression_loss: 0.4104 - classification_loss: 0.0319 417/500 [========================>.....] - ETA: 41s - loss: 0.4417 - regression_loss: 0.4098 - classification_loss: 0.0319 418/500 [========================>.....] - ETA: 41s - loss: 0.4412 - regression_loss: 0.4094 - classification_loss: 0.0319 419/500 [========================>.....] - ETA: 40s - loss: 0.4417 - regression_loss: 0.4097 - classification_loss: 0.0319 420/500 [========================>.....] - ETA: 40s - loss: 0.4413 - regression_loss: 0.4094 - classification_loss: 0.0319 421/500 [========================>.....] - ETA: 39s - loss: 0.4414 - regression_loss: 0.4094 - classification_loss: 0.0320 422/500 [========================>.....] - ETA: 39s - loss: 0.4411 - regression_loss: 0.4092 - classification_loss: 0.0319 423/500 [========================>.....] - ETA: 38s - loss: 0.4411 - regression_loss: 0.4092 - classification_loss: 0.0319 424/500 [========================>.....] - ETA: 38s - loss: 0.4412 - regression_loss: 0.4092 - classification_loss: 0.0319 425/500 [========================>.....] 
- ETA: 37s - loss: 0.4415 - regression_loss: 0.4095 - classification_loss: 0.0320 426/500 [========================>.....] - ETA: 37s - loss: 0.4418 - regression_loss: 0.4098 - classification_loss: 0.0320 427/500 [========================>.....] - ETA: 36s - loss: 0.4414 - regression_loss: 0.4094 - classification_loss: 0.0320 428/500 [========================>.....] - ETA: 36s - loss: 0.4415 - regression_loss: 0.4095 - classification_loss: 0.0320 429/500 [========================>.....] - ETA: 35s - loss: 0.4415 - regression_loss: 0.4095 - classification_loss: 0.0320 430/500 [========================>.....] - ETA: 35s - loss: 0.4416 - regression_loss: 0.4096 - classification_loss: 0.0320 431/500 [========================>.....] - ETA: 34s - loss: 0.4429 - regression_loss: 0.4109 - classification_loss: 0.0321 432/500 [========================>.....] - ETA: 34s - loss: 0.4431 - regression_loss: 0.4110 - classification_loss: 0.0321 433/500 [========================>.....] - ETA: 33s - loss: 0.4435 - regression_loss: 0.4114 - classification_loss: 0.0321 434/500 [=========================>....] - ETA: 33s - loss: 0.4437 - regression_loss: 0.4116 - classification_loss: 0.0321 435/500 [=========================>....] - ETA: 32s - loss: 0.4439 - regression_loss: 0.4118 - classification_loss: 0.0321 436/500 [=========================>....] - ETA: 32s - loss: 0.4437 - regression_loss: 0.4116 - classification_loss: 0.0321 437/500 [=========================>....] - ETA: 31s - loss: 0.4437 - regression_loss: 0.4116 - classification_loss: 0.0321 438/500 [=========================>....] - ETA: 31s - loss: 0.4431 - regression_loss: 0.4111 - classification_loss: 0.0320 439/500 [=========================>....] - ETA: 30s - loss: 0.4431 - regression_loss: 0.4111 - classification_loss: 0.0320 440/500 [=========================>....] - ETA: 30s - loss: 0.4426 - regression_loss: 0.4106 - classification_loss: 0.0320 441/500 [=========================>....] 
- ETA: 29s - loss: 0.4432 - regression_loss: 0.4112 - classification_loss: 0.0320 442/500 [=========================>....] - ETA: 29s - loss: 0.4429 - regression_loss: 0.4110 - classification_loss: 0.0319 443/500 [=========================>....] - ETA: 28s - loss: 0.4426 - regression_loss: 0.4107 - classification_loss: 0.0319 444/500 [=========================>....] - ETA: 28s - loss: 0.4420 - regression_loss: 0.4102 - classification_loss: 0.0319 445/500 [=========================>....] - ETA: 27s - loss: 0.4422 - regression_loss: 0.4103 - classification_loss: 0.0318 446/500 [=========================>....] - ETA: 27s - loss: 0.4423 - regression_loss: 0.4104 - classification_loss: 0.0318 447/500 [=========================>....] - ETA: 26s - loss: 0.4423 - regression_loss: 0.4105 - classification_loss: 0.0319 448/500 [=========================>....] - ETA: 26s - loss: 0.4425 - regression_loss: 0.4107 - classification_loss: 0.0319 449/500 [=========================>....] - ETA: 25s - loss: 0.4427 - regression_loss: 0.4108 - classification_loss: 0.0319 450/500 [==========================>...] - ETA: 25s - loss: 0.4435 - regression_loss: 0.4115 - classification_loss: 0.0320 451/500 [==========================>...] - ETA: 24s - loss: 0.4430 - regression_loss: 0.4110 - classification_loss: 0.0320 452/500 [==========================>...] - ETA: 24s - loss: 0.4434 - regression_loss: 0.4113 - classification_loss: 0.0321 453/500 [==========================>...] - ETA: 23s - loss: 0.4434 - regression_loss: 0.4113 - classification_loss: 0.0321 454/500 [==========================>...] - ETA: 23s - loss: 0.4431 - regression_loss: 0.4111 - classification_loss: 0.0320 455/500 [==========================>...] - ETA: 22s - loss: 0.4432 - regression_loss: 0.4112 - classification_loss: 0.0320 456/500 [==========================>...] - ETA: 22s - loss: 0.4427 - regression_loss: 0.4107 - classification_loss: 0.0320 457/500 [==========================>...] 
- ETA: 21s - loss: 0.4426 - regression_loss: 0.4106 - classification_loss: 0.0320 458/500 [==========================>...] - ETA: 21s - loss: 0.4424 - regression_loss: 0.4104 - classification_loss: 0.0320 459/500 [==========================>...] - ETA: 20s - loss: 0.4425 - regression_loss: 0.4105 - classification_loss: 0.0320 460/500 [==========================>...] - ETA: 20s - loss: 0.4426 - regression_loss: 0.4106 - classification_loss: 0.0320 461/500 [==========================>...] - ETA: 19s - loss: 0.4427 - regression_loss: 0.4107 - classification_loss: 0.0320 462/500 [==========================>...] - ETA: 19s - loss: 0.4428 - regression_loss: 0.4108 - classification_loss: 0.0320 463/500 [==========================>...] - ETA: 18s - loss: 0.4426 - regression_loss: 0.4106 - classification_loss: 0.0320 464/500 [==========================>...] - ETA: 18s - loss: 0.4427 - regression_loss: 0.4107 - classification_loss: 0.0320 465/500 [==========================>...] - ETA: 17s - loss: 0.4433 - regression_loss: 0.4113 - classification_loss: 0.0320 466/500 [==========================>...] - ETA: 17s - loss: 0.4442 - regression_loss: 0.4120 - classification_loss: 0.0322 467/500 [===========================>..] - ETA: 16s - loss: 0.4444 - regression_loss: 0.4121 - classification_loss: 0.0322 468/500 [===========================>..] - ETA: 16s - loss: 0.4446 - regression_loss: 0.4124 - classification_loss: 0.0322 469/500 [===========================>..] - ETA: 15s - loss: 0.4443 - regression_loss: 0.4121 - classification_loss: 0.0322 470/500 [===========================>..] - ETA: 15s - loss: 0.4441 - regression_loss: 0.4119 - classification_loss: 0.0322 471/500 [===========================>..] - ETA: 14s - loss: 0.4435 - regression_loss: 0.4114 - classification_loss: 0.0321 472/500 [===========================>..] - ETA: 14s - loss: 0.4437 - regression_loss: 0.4116 - classification_loss: 0.0321 473/500 [===========================>..] 
- ETA: 13s - loss: 0.4434 - regression_loss: 0.4113 - classification_loss: 0.0321 474/500 [===========================>..] - ETA: 13s - loss: 0.4429 - regression_loss: 0.4108 - classification_loss: 0.0320 475/500 [===========================>..] - ETA: 12s - loss: 0.4430 - regression_loss: 0.4110 - classification_loss: 0.0321 476/500 [===========================>..] - ETA: 12s - loss: 0.4426 - regression_loss: 0.4106 - classification_loss: 0.0320 477/500 [===========================>..] - ETA: 11s - loss: 0.4424 - regression_loss: 0.4104 - classification_loss: 0.0320 478/500 [===========================>..] - ETA: 11s - loss: 0.4424 - regression_loss: 0.4104 - classification_loss: 0.0320 479/500 [===========================>..] - ETA: 10s - loss: 0.4419 - regression_loss: 0.4100 - classification_loss: 0.0319 480/500 [===========================>..] - ETA: 10s - loss: 0.4417 - regression_loss: 0.4097 - classification_loss: 0.0319 481/500 [===========================>..] - ETA: 9s - loss: 0.4412 - regression_loss: 0.4093 - classification_loss: 0.0319  482/500 [===========================>..] - ETA: 9s - loss: 0.4408 - regression_loss: 0.4089 - classification_loss: 0.0319 483/500 [===========================>..] - ETA: 8s - loss: 0.4404 - regression_loss: 0.4086 - classification_loss: 0.0318 484/500 [============================>.] - ETA: 8s - loss: 0.4400 - regression_loss: 0.4082 - classification_loss: 0.0318 485/500 [============================>.] - ETA: 7s - loss: 0.4399 - regression_loss: 0.4081 - classification_loss: 0.0318 486/500 [============================>.] - ETA: 7s - loss: 0.4400 - regression_loss: 0.4081 - classification_loss: 0.0319 487/500 [============================>.] - ETA: 6s - loss: 0.4395 - regression_loss: 0.4076 - classification_loss: 0.0318 488/500 [============================>.] - ETA: 6s - loss: 0.4393 - regression_loss: 0.4075 - classification_loss: 0.0318 489/500 [============================>.] 
- ETA: 5s - loss: 0.4387 - regression_loss: 0.4069 - classification_loss: 0.0318 490/500 [============================>.] - ETA: 5s - loss: 0.4389 - regression_loss: 0.4071 - classification_loss: 0.0318 491/500 [============================>.] - ETA: 4s - loss: 0.4386 - regression_loss: 0.4068 - classification_loss: 0.0318 492/500 [============================>.] - ETA: 4s - loss: 0.4389 - regression_loss: 0.4070 - classification_loss: 0.0319 493/500 [============================>.] - ETA: 3s - loss: 0.4387 - regression_loss: 0.4069 - classification_loss: 0.0318 494/500 [============================>.] - ETA: 3s - loss: 0.4383 - regression_loss: 0.4065 - classification_loss: 0.0318 495/500 [============================>.] - ETA: 2s - loss: 0.4380 - regression_loss: 0.4063 - classification_loss: 0.0318 496/500 [============================>.] - ETA: 2s - loss: 0.4382 - regression_loss: 0.4065 - classification_loss: 0.0317 497/500 [============================>.] - ETA: 1s - loss: 0.4382 - regression_loss: 0.4065 - classification_loss: 0.0317 498/500 [============================>.] - ETA: 1s - loss: 0.4375 - regression_loss: 0.4058 - classification_loss: 0.0317 499/500 [============================>.] - ETA: 0s - loss: 0.4380 - regression_loss: 0.4063 - classification_loss: 0.0317 500/500 [==============================] - 252s 503ms/step - loss: 0.4383 - regression_loss: 0.4066 - classification_loss: 0.0317 1172 instances of class plum with average precision: 0.7189 mAP: 0.7189 Epoch 00008: saving model to ./training/snapshots/resnet101_pascal_08.h5 Epoch 9/150 1/500 [..............................] - ETA: 4:09 - loss: 0.4217 - regression_loss: 0.4027 - classification_loss: 0.0190 2/500 [..............................] - ETA: 4:06 - loss: 0.3517 - regression_loss: 0.3275 - classification_loss: 0.0242 3/500 [..............................] - ETA: 4:08 - loss: 0.4142 - regression_loss: 0.3884 - classification_loss: 0.0257 4/500 [..............................] 
- ETA: 4:07 - loss: 0.4233 - regression_loss: 0.3978 - classification_loss: 0.0255 5/500 [..............................] - ETA: 4:09 - loss: 0.4450 - regression_loss: 0.4165 - classification_loss: 0.0285 6/500 [..............................] - ETA: 4:08 - loss: 0.4417 - regression_loss: 0.4103 - classification_loss: 0.0315 7/500 [..............................] - ETA: 4:44 - loss: 0.4364 - regression_loss: 0.4048 - classification_loss: 0.0317 8/500 [..............................] - ETA: 4:39 - loss: 0.4323 - regression_loss: 0.4022 - classification_loss: 0.0301 9/500 [..............................] - ETA: 4:34 - loss: 0.4133 - regression_loss: 0.3844 - classification_loss: 0.0289 10/500 [..............................] - ETA: 4:30 - loss: 0.4103 - regression_loss: 0.3795 - classification_loss: 0.0308 11/500 [..............................] - ETA: 4:27 - loss: 0.3985 - regression_loss: 0.3678 - classification_loss: 0.0307 12/500 [..............................] - ETA: 4:24 - loss: 0.3931 - regression_loss: 0.3629 - classification_loss: 0.0302 13/500 [..............................] - ETA: 4:21 - loss: 0.3907 - regression_loss: 0.3612 - classification_loss: 0.0294 14/500 [..............................] - ETA: 4:19 - loss: 0.3811 - regression_loss: 0.3525 - classification_loss: 0.0286 15/500 [..............................] - ETA: 4:18 - loss: 0.3911 - regression_loss: 0.3611 - classification_loss: 0.0301 16/500 [..............................] - ETA: 4:18 - loss: 0.3804 - regression_loss: 0.3511 - classification_loss: 0.0293 17/500 [>.............................] - ETA: 4:16 - loss: 0.3776 - regression_loss: 0.3487 - classification_loss: 0.0289 18/500 [>.............................] - ETA: 4:14 - loss: 0.3773 - regression_loss: 0.3488 - classification_loss: 0.0286 19/500 [>.............................] - ETA: 4:13 - loss: 0.3874 - regression_loss: 0.3580 - classification_loss: 0.0295 20/500 [>.............................] 
 - ETA: 4:12 - loss: 0.3936 - regression_loss: 0.3640 - classification_loss: 0.0296 
[Keras per-batch progress output for steps 21–355 of 500 condensed: loss ≈ 0.37–0.41, regression_loss ≈ 0.34–0.38, classification_loss ≈ 0.027–0.032, ETA falling from 4:11 to 1:13] 
356/500 [====================>.........] 
- ETA: 1:12 - loss: 0.4014 - regression_loss: 0.3736 - classification_loss: 0.0277 357/500 [====================>.........] - ETA: 1:12 - loss: 0.4015 - regression_loss: 0.3738 - classification_loss: 0.0277 358/500 [====================>.........] - ETA: 1:11 - loss: 0.4028 - regression_loss: 0.3749 - classification_loss: 0.0279 359/500 [====================>.........] - ETA: 1:11 - loss: 0.4045 - regression_loss: 0.3764 - classification_loss: 0.0281 360/500 [====================>.........] - ETA: 1:10 - loss: 0.4042 - regression_loss: 0.3761 - classification_loss: 0.0281 361/500 [====================>.........] - ETA: 1:10 - loss: 0.4041 - regression_loss: 0.3760 - classification_loss: 0.0281 362/500 [====================>.........] - ETA: 1:09 - loss: 0.4040 - regression_loss: 0.3760 - classification_loss: 0.0281 363/500 [====================>.........] - ETA: 1:08 - loss: 0.4042 - regression_loss: 0.3762 - classification_loss: 0.0281 364/500 [====================>.........] - ETA: 1:08 - loss: 0.4040 - regression_loss: 0.3760 - classification_loss: 0.0280 365/500 [====================>.........] - ETA: 1:07 - loss: 0.4041 - regression_loss: 0.3761 - classification_loss: 0.0280 366/500 [====================>.........] - ETA: 1:07 - loss: 0.4039 - regression_loss: 0.3759 - classification_loss: 0.0280 367/500 [=====================>........] - ETA: 1:06 - loss: 0.4048 - regression_loss: 0.3767 - classification_loss: 0.0281 368/500 [=====================>........] - ETA: 1:06 - loss: 0.4049 - regression_loss: 0.3768 - classification_loss: 0.0281 369/500 [=====================>........] - ETA: 1:05 - loss: 0.4049 - regression_loss: 0.3768 - classification_loss: 0.0281 370/500 [=====================>........] - ETA: 1:05 - loss: 0.4045 - regression_loss: 0.3764 - classification_loss: 0.0281 371/500 [=====================>........] - ETA: 1:04 - loss: 0.4041 - regression_loss: 0.3761 - classification_loss: 0.0281 372/500 [=====================>........] 
- ETA: 1:04 - loss: 0.4040 - regression_loss: 0.3760 - classification_loss: 0.0281 373/500 [=====================>........] - ETA: 1:03 - loss: 0.4040 - regression_loss: 0.3760 - classification_loss: 0.0280 374/500 [=====================>........] - ETA: 1:03 - loss: 0.4042 - regression_loss: 0.3761 - classification_loss: 0.0281 375/500 [=====================>........] - ETA: 1:02 - loss: 0.4041 - regression_loss: 0.3760 - classification_loss: 0.0281 376/500 [=====================>........] - ETA: 1:02 - loss: 0.4044 - regression_loss: 0.3763 - classification_loss: 0.0281 377/500 [=====================>........] - ETA: 1:01 - loss: 0.4042 - regression_loss: 0.3761 - classification_loss: 0.0281 378/500 [=====================>........] - ETA: 1:01 - loss: 0.4039 - regression_loss: 0.3758 - classification_loss: 0.0281 379/500 [=====================>........] - ETA: 1:00 - loss: 0.4032 - regression_loss: 0.3751 - classification_loss: 0.0280 380/500 [=====================>........] - ETA: 1:00 - loss: 0.4040 - regression_loss: 0.3758 - classification_loss: 0.0281 381/500 [=====================>........] - ETA: 59s - loss: 0.4037 - regression_loss: 0.3756 - classification_loss: 0.0281  382/500 [=====================>........] - ETA: 59s - loss: 0.4043 - regression_loss: 0.3762 - classification_loss: 0.0282 383/500 [=====================>........] - ETA: 58s - loss: 0.4045 - regression_loss: 0.3763 - classification_loss: 0.0282 384/500 [======================>.......] - ETA: 58s - loss: 0.4042 - regression_loss: 0.3761 - classification_loss: 0.0282 385/500 [======================>.......] - ETA: 57s - loss: 0.4038 - regression_loss: 0.3757 - classification_loss: 0.0282 386/500 [======================>.......] - ETA: 57s - loss: 0.4038 - regression_loss: 0.3756 - classification_loss: 0.0282 387/500 [======================>.......] - ETA: 56s - loss: 0.4031 - regression_loss: 0.3749 - classification_loss: 0.0282 388/500 [======================>.......] 
- ETA: 56s - loss: 0.4031 - regression_loss: 0.3749 - classification_loss: 0.0282 389/500 [======================>.......] - ETA: 55s - loss: 0.4029 - regression_loss: 0.3748 - classification_loss: 0.0282 390/500 [======================>.......] - ETA: 55s - loss: 0.4027 - regression_loss: 0.3746 - classification_loss: 0.0281 391/500 [======================>.......] - ETA: 54s - loss: 0.4026 - regression_loss: 0.3744 - classification_loss: 0.0282 392/500 [======================>.......] - ETA: 54s - loss: 0.4021 - regression_loss: 0.3739 - classification_loss: 0.0281 393/500 [======================>.......] - ETA: 53s - loss: 0.4022 - regression_loss: 0.3741 - classification_loss: 0.0281 394/500 [======================>.......] - ETA: 53s - loss: 0.4018 - regression_loss: 0.3737 - classification_loss: 0.0281 395/500 [======================>.......] - ETA: 52s - loss: 0.4014 - regression_loss: 0.3734 - classification_loss: 0.0281 396/500 [======================>.......] - ETA: 52s - loss: 0.4015 - regression_loss: 0.3735 - classification_loss: 0.0280 397/500 [======================>.......] - ETA: 51s - loss: 0.4025 - regression_loss: 0.3743 - classification_loss: 0.0281 398/500 [======================>.......] - ETA: 51s - loss: 0.4032 - regression_loss: 0.3750 - classification_loss: 0.0282 399/500 [======================>.......] - ETA: 50s - loss: 0.4044 - regression_loss: 0.3761 - classification_loss: 0.0284 400/500 [=======================>......] - ETA: 50s - loss: 0.4039 - regression_loss: 0.3756 - classification_loss: 0.0283 401/500 [=======================>......] - ETA: 49s - loss: 0.4041 - regression_loss: 0.3758 - classification_loss: 0.0283 402/500 [=======================>......] - ETA: 49s - loss: 0.4037 - regression_loss: 0.3754 - classification_loss: 0.0283 403/500 [=======================>......] - ETA: 48s - loss: 0.4039 - regression_loss: 0.3755 - classification_loss: 0.0283 404/500 [=======================>......] 
- ETA: 48s - loss: 0.4038 - regression_loss: 0.3754 - classification_loss: 0.0284 405/500 [=======================>......] - ETA: 47s - loss: 0.4032 - regression_loss: 0.3748 - classification_loss: 0.0283 406/500 [=======================>......] - ETA: 47s - loss: 0.4036 - regression_loss: 0.3753 - classification_loss: 0.0283 407/500 [=======================>......] - ETA: 46s - loss: 0.4041 - regression_loss: 0.3756 - classification_loss: 0.0284 408/500 [=======================>......] - ETA: 46s - loss: 0.4038 - regression_loss: 0.3754 - classification_loss: 0.0284 409/500 [=======================>......] - ETA: 45s - loss: 0.4035 - regression_loss: 0.3752 - classification_loss: 0.0284 410/500 [=======================>......] - ETA: 45s - loss: 0.4040 - regression_loss: 0.3757 - classification_loss: 0.0284 411/500 [=======================>......] - ETA: 44s - loss: 0.4039 - regression_loss: 0.3756 - classification_loss: 0.0283 412/500 [=======================>......] - ETA: 44s - loss: 0.4049 - regression_loss: 0.3765 - classification_loss: 0.0284 413/500 [=======================>......] - ETA: 43s - loss: 0.4060 - regression_loss: 0.3775 - classification_loss: 0.0284 414/500 [=======================>......] - ETA: 43s - loss: 0.4057 - regression_loss: 0.3773 - classification_loss: 0.0284 415/500 [=======================>......] - ETA: 42s - loss: 0.4064 - regression_loss: 0.3779 - classification_loss: 0.0285 416/500 [=======================>......] - ETA: 42s - loss: 0.4065 - regression_loss: 0.3780 - classification_loss: 0.0285 417/500 [========================>.....] - ETA: 41s - loss: 0.4059 - regression_loss: 0.3774 - classification_loss: 0.0285 418/500 [========================>.....] - ETA: 41s - loss: 0.4058 - regression_loss: 0.3773 - classification_loss: 0.0285 419/500 [========================>.....] - ETA: 40s - loss: 0.4062 - regression_loss: 0.3778 - classification_loss: 0.0285 420/500 [========================>.....] 
- ETA: 40s - loss: 0.4066 - regression_loss: 0.3781 - classification_loss: 0.0284 421/500 [========================>.....] - ETA: 39s - loss: 0.4069 - regression_loss: 0.3785 - classification_loss: 0.0285 422/500 [========================>.....] - ETA: 39s - loss: 0.4071 - regression_loss: 0.3787 - classification_loss: 0.0284 423/500 [========================>.....] - ETA: 38s - loss: 0.4081 - regression_loss: 0.3796 - classification_loss: 0.0285 424/500 [========================>.....] - ETA: 38s - loss: 0.4086 - regression_loss: 0.3801 - classification_loss: 0.0285 425/500 [========================>.....] - ETA: 37s - loss: 0.4084 - regression_loss: 0.3799 - classification_loss: 0.0285 426/500 [========================>.....] - ETA: 37s - loss: 0.4083 - regression_loss: 0.3798 - classification_loss: 0.0285 427/500 [========================>.....] - ETA: 36s - loss: 0.4082 - regression_loss: 0.3797 - classification_loss: 0.0284 428/500 [========================>.....] - ETA: 36s - loss: 0.4084 - regression_loss: 0.3800 - classification_loss: 0.0284 429/500 [========================>.....] - ETA: 35s - loss: 0.4080 - regression_loss: 0.3796 - classification_loss: 0.0284 430/500 [========================>.....] - ETA: 35s - loss: 0.4085 - regression_loss: 0.3801 - classification_loss: 0.0284 431/500 [========================>.....] - ETA: 34s - loss: 0.4082 - regression_loss: 0.3798 - classification_loss: 0.0284 432/500 [========================>.....] - ETA: 34s - loss: 0.4088 - regression_loss: 0.3804 - classification_loss: 0.0284 433/500 [========================>.....] - ETA: 33s - loss: 0.4093 - regression_loss: 0.3809 - classification_loss: 0.0284 434/500 [=========================>....] - ETA: 33s - loss: 0.4095 - regression_loss: 0.3811 - classification_loss: 0.0284 435/500 [=========================>....] - ETA: 32s - loss: 0.4100 - regression_loss: 0.3816 - classification_loss: 0.0285 436/500 [=========================>....] 
- ETA: 32s - loss: 0.4098 - regression_loss: 0.3814 - classification_loss: 0.0284 437/500 [=========================>....] - ETA: 31s - loss: 0.4098 - regression_loss: 0.3813 - classification_loss: 0.0284 438/500 [=========================>....] - ETA: 31s - loss: 0.4098 - regression_loss: 0.3814 - classification_loss: 0.0284 439/500 [=========================>....] - ETA: 30s - loss: 0.4105 - regression_loss: 0.3821 - classification_loss: 0.0284 440/500 [=========================>....] - ETA: 30s - loss: 0.4107 - regression_loss: 0.3823 - classification_loss: 0.0284 441/500 [=========================>....] - ETA: 29s - loss: 0.4115 - regression_loss: 0.3830 - classification_loss: 0.0285 442/500 [=========================>....] - ETA: 29s - loss: 0.4122 - regression_loss: 0.3837 - classification_loss: 0.0285 443/500 [=========================>....] - ETA: 28s - loss: 0.4129 - regression_loss: 0.3843 - classification_loss: 0.0285 444/500 [=========================>....] - ETA: 28s - loss: 0.4135 - regression_loss: 0.3849 - classification_loss: 0.0286 445/500 [=========================>....] - ETA: 27s - loss: 0.4140 - regression_loss: 0.3854 - classification_loss: 0.0286 446/500 [=========================>....] - ETA: 27s - loss: 0.4138 - regression_loss: 0.3852 - classification_loss: 0.0286 447/500 [=========================>....] - ETA: 26s - loss: 0.4147 - regression_loss: 0.3860 - classification_loss: 0.0288 448/500 [=========================>....] - ETA: 26s - loss: 0.4146 - regression_loss: 0.3859 - classification_loss: 0.0287 449/500 [=========================>....] - ETA: 25s - loss: 0.4145 - regression_loss: 0.3858 - classification_loss: 0.0287 450/500 [==========================>...] - ETA: 25s - loss: 0.4152 - regression_loss: 0.3864 - classification_loss: 0.0288 451/500 [==========================>...] - ETA: 24s - loss: 0.4154 - regression_loss: 0.3866 - classification_loss: 0.0288 452/500 [==========================>...] 
- ETA: 24s - loss: 0.4152 - regression_loss: 0.3864 - classification_loss: 0.0288 453/500 [==========================>...] - ETA: 23s - loss: 0.4156 - regression_loss: 0.3867 - classification_loss: 0.0288 454/500 [==========================>...] - ETA: 23s - loss: 0.4158 - regression_loss: 0.3870 - classification_loss: 0.0288 455/500 [==========================>...] - ETA: 22s - loss: 0.4162 - regression_loss: 0.3873 - classification_loss: 0.0289 456/500 [==========================>...] - ETA: 22s - loss: 0.4160 - regression_loss: 0.3872 - classification_loss: 0.0288 457/500 [==========================>...] - ETA: 21s - loss: 0.4166 - regression_loss: 0.3877 - classification_loss: 0.0289 458/500 [==========================>...] - ETA: 21s - loss: 0.4162 - regression_loss: 0.3874 - classification_loss: 0.0289 459/500 [==========================>...] - ETA: 20s - loss: 0.4161 - regression_loss: 0.3872 - classification_loss: 0.0289 460/500 [==========================>...] - ETA: 20s - loss: 0.4161 - regression_loss: 0.3872 - classification_loss: 0.0289 461/500 [==========================>...] - ETA: 19s - loss: 0.4165 - regression_loss: 0.3876 - classification_loss: 0.0290 462/500 [==========================>...] - ETA: 19s - loss: 0.4163 - regression_loss: 0.3873 - classification_loss: 0.0289 463/500 [==========================>...] - ETA: 18s - loss: 0.4162 - regression_loss: 0.3872 - classification_loss: 0.0289 464/500 [==========================>...] - ETA: 18s - loss: 0.4160 - regression_loss: 0.3870 - classification_loss: 0.0289 465/500 [==========================>...] - ETA: 17s - loss: 0.4160 - regression_loss: 0.3870 - classification_loss: 0.0290 466/500 [==========================>...] - ETA: 17s - loss: 0.4155 - regression_loss: 0.3866 - classification_loss: 0.0289 467/500 [===========================>..] - ETA: 16s - loss: 0.4153 - regression_loss: 0.3864 - classification_loss: 0.0289 468/500 [===========================>..] 
- ETA: 16s - loss: 0.4158 - regression_loss: 0.3869 - classification_loss: 0.0289 469/500 [===========================>..] - ETA: 15s - loss: 0.4156 - regression_loss: 0.3867 - classification_loss: 0.0289 470/500 [===========================>..] - ETA: 15s - loss: 0.4152 - regression_loss: 0.3863 - classification_loss: 0.0288 471/500 [===========================>..] - ETA: 14s - loss: 0.4148 - regression_loss: 0.3860 - classification_loss: 0.0288 472/500 [===========================>..] - ETA: 14s - loss: 0.4147 - regression_loss: 0.3860 - classification_loss: 0.0288 473/500 [===========================>..] - ETA: 13s - loss: 0.4147 - regression_loss: 0.3859 - classification_loss: 0.0287 474/500 [===========================>..] - ETA: 13s - loss: 0.4146 - regression_loss: 0.3858 - classification_loss: 0.0288 475/500 [===========================>..] - ETA: 12s - loss: 0.4151 - regression_loss: 0.3863 - classification_loss: 0.0288 476/500 [===========================>..] - ETA: 12s - loss: 0.4153 - regression_loss: 0.3865 - classification_loss: 0.0288 477/500 [===========================>..] - ETA: 11s - loss: 0.4152 - regression_loss: 0.3863 - classification_loss: 0.0289 478/500 [===========================>..] - ETA: 11s - loss: 0.4149 - regression_loss: 0.3860 - classification_loss: 0.0288 479/500 [===========================>..] - ETA: 10s - loss: 0.4150 - regression_loss: 0.3861 - classification_loss: 0.0288 480/500 [===========================>..] - ETA: 10s - loss: 0.4147 - regression_loss: 0.3859 - classification_loss: 0.0288 481/500 [===========================>..] - ETA: 9s - loss: 0.4149 - regression_loss: 0.3861 - classification_loss: 0.0288  482/500 [===========================>..] - ETA: 9s - loss: 0.4152 - regression_loss: 0.3864 - classification_loss: 0.0288 483/500 [===========================>..] - ETA: 8s - loss: 0.4149 - regression_loss: 0.3861 - classification_loss: 0.0288 484/500 [============================>.] 
- ETA: 8s - loss: 0.4147 - regression_loss: 0.3859 - classification_loss: 0.0288 485/500 [============================>.] - ETA: 7s - loss: 0.4153 - regression_loss: 0.3864 - classification_loss: 0.0289 486/500 [============================>.] - ETA: 7s - loss: 0.4156 - regression_loss: 0.3867 - classification_loss: 0.0289 487/500 [============================>.] - ETA: 6s - loss: 0.4158 - regression_loss: 0.3869 - classification_loss: 0.0289 488/500 [============================>.] - ETA: 6s - loss: 0.4158 - regression_loss: 0.3869 - classification_loss: 0.0289 489/500 [============================>.] - ETA: 5s - loss: 0.4161 - regression_loss: 0.3872 - classification_loss: 0.0289 490/500 [============================>.] - ETA: 5s - loss: 0.4159 - regression_loss: 0.3869 - classification_loss: 0.0289 491/500 [============================>.] - ETA: 4s - loss: 0.4155 - regression_loss: 0.3866 - classification_loss: 0.0289 492/500 [============================>.] - ETA: 4s - loss: 0.4156 - regression_loss: 0.3867 - classification_loss: 0.0289 493/500 [============================>.] - ETA: 3s - loss: 0.4158 - regression_loss: 0.3868 - classification_loss: 0.0290 494/500 [============================>.] - ETA: 3s - loss: 0.4154 - regression_loss: 0.3865 - classification_loss: 0.0289 495/500 [============================>.] - ETA: 2s - loss: 0.4154 - regression_loss: 0.3865 - classification_loss: 0.0289 496/500 [============================>.] - ETA: 2s - loss: 0.4154 - regression_loss: 0.3865 - classification_loss: 0.0289 497/500 [============================>.] - ETA: 1s - loss: 0.4151 - regression_loss: 0.3862 - classification_loss: 0.0289 498/500 [============================>.] - ETA: 1s - loss: 0.4149 - regression_loss: 0.3860 - classification_loss: 0.0289 499/500 [============================>.] 
500/500 [==============================] - 251s 503ms/step - loss: 0.4144 - regression_loss: 0.3856 - classification_loss: 0.0289
1172 instances of class plum with average precision: 0.7388
mAP: 0.7388
Epoch 00009: saving model to ./training/snapshots/resnet101_pascal_09.h5
Epoch 10/150
[Epoch 10/150: per-batch progress output for steps 1-125 omitted; loss settled near 0.39 (regression_loss ~0.36, classification_loss ~0.029) by step 125.]
- ETA: 3:08 - loss: 0.3911 - regression_loss: 0.3623 - classification_loss: 0.0287 127/500 [======>.......................] - ETA: 3:07 - loss: 0.3912 - regression_loss: 0.3624 - classification_loss: 0.0288 128/500 [======>.......................] - ETA: 3:07 - loss: 0.3918 - regression_loss: 0.3630 - classification_loss: 0.0288 129/500 [======>.......................] - ETA: 3:06 - loss: 0.3907 - regression_loss: 0.3620 - classification_loss: 0.0287 130/500 [======>.......................] - ETA: 3:06 - loss: 0.3904 - regression_loss: 0.3618 - classification_loss: 0.0286 131/500 [======>.......................] - ETA: 3:05 - loss: 0.3891 - regression_loss: 0.3606 - classification_loss: 0.0285 132/500 [======>.......................] - ETA: 3:05 - loss: 0.3915 - regression_loss: 0.3630 - classification_loss: 0.0286 133/500 [======>.......................] - ETA: 3:04 - loss: 0.3933 - regression_loss: 0.3644 - classification_loss: 0.0288 134/500 [=======>......................] - ETA: 3:04 - loss: 0.3928 - regression_loss: 0.3641 - classification_loss: 0.0287 135/500 [=======>......................] - ETA: 3:03 - loss: 0.3915 - regression_loss: 0.3629 - classification_loss: 0.0286 136/500 [=======>......................] - ETA: 3:03 - loss: 0.3939 - regression_loss: 0.3651 - classification_loss: 0.0288 137/500 [=======>......................] - ETA: 3:02 - loss: 0.3969 - regression_loss: 0.3678 - classification_loss: 0.0291 138/500 [=======>......................] - ETA: 3:02 - loss: 0.3979 - regression_loss: 0.3688 - classification_loss: 0.0291 139/500 [=======>......................] - ETA: 3:01 - loss: 0.4004 - regression_loss: 0.3711 - classification_loss: 0.0293 140/500 [=======>......................] - ETA: 3:01 - loss: 0.4014 - regression_loss: 0.3719 - classification_loss: 0.0295 141/500 [=======>......................] - ETA: 3:00 - loss: 0.4018 - regression_loss: 0.3723 - classification_loss: 0.0295 142/500 [=======>......................] 
- ETA: 3:00 - loss: 0.4021 - regression_loss: 0.3724 - classification_loss: 0.0297 143/500 [=======>......................] - ETA: 2:59 - loss: 0.4023 - regression_loss: 0.3726 - classification_loss: 0.0297 144/500 [=======>......................] - ETA: 2:59 - loss: 0.4014 - regression_loss: 0.3719 - classification_loss: 0.0295 145/500 [=======>......................] - ETA: 2:58 - loss: 0.4008 - regression_loss: 0.3714 - classification_loss: 0.0294 146/500 [=======>......................] - ETA: 2:58 - loss: 0.4023 - regression_loss: 0.3727 - classification_loss: 0.0295 147/500 [=======>......................] - ETA: 2:57 - loss: 0.4014 - regression_loss: 0.3719 - classification_loss: 0.0295 148/500 [=======>......................] - ETA: 2:57 - loss: 0.4025 - regression_loss: 0.3727 - classification_loss: 0.0298 149/500 [=======>......................] - ETA: 2:56 - loss: 0.4032 - regression_loss: 0.3733 - classification_loss: 0.0298 150/500 [========>.....................] - ETA: 2:56 - loss: 0.4019 - regression_loss: 0.3722 - classification_loss: 0.0297 151/500 [========>.....................] - ETA: 2:55 - loss: 0.4020 - regression_loss: 0.3723 - classification_loss: 0.0297 152/500 [========>.....................] - ETA: 2:55 - loss: 0.4021 - regression_loss: 0.3724 - classification_loss: 0.0297 153/500 [========>.....................] - ETA: 2:54 - loss: 0.4021 - regression_loss: 0.3723 - classification_loss: 0.0298 154/500 [========>.....................] - ETA: 2:54 - loss: 0.4034 - regression_loss: 0.3734 - classification_loss: 0.0300 155/500 [========>.....................] - ETA: 2:53 - loss: 0.4031 - regression_loss: 0.3732 - classification_loss: 0.0298 156/500 [========>.....................] - ETA: 2:53 - loss: 0.4041 - regression_loss: 0.3742 - classification_loss: 0.0299 157/500 [========>.....................] - ETA: 2:52 - loss: 0.4056 - regression_loss: 0.3756 - classification_loss: 0.0300 158/500 [========>.....................] 
- ETA: 2:52 - loss: 0.4051 - regression_loss: 0.3752 - classification_loss: 0.0299 159/500 [========>.....................] - ETA: 2:51 - loss: 0.4049 - regression_loss: 0.3751 - classification_loss: 0.0299 160/500 [========>.....................] - ETA: 2:51 - loss: 0.4046 - regression_loss: 0.3748 - classification_loss: 0.0298 161/500 [========>.....................] - ETA: 2:50 - loss: 0.4046 - regression_loss: 0.3748 - classification_loss: 0.0297 162/500 [========>.....................] - ETA: 2:50 - loss: 0.4034 - regression_loss: 0.3738 - classification_loss: 0.0296 163/500 [========>.....................] - ETA: 2:49 - loss: 0.4027 - regression_loss: 0.3731 - classification_loss: 0.0296 164/500 [========>.....................] - ETA: 2:49 - loss: 0.4025 - regression_loss: 0.3729 - classification_loss: 0.0296 165/500 [========>.....................] - ETA: 2:48 - loss: 0.4011 - regression_loss: 0.3716 - classification_loss: 0.0295 166/500 [========>.....................] - ETA: 2:48 - loss: 0.4028 - regression_loss: 0.3732 - classification_loss: 0.0296 167/500 [=========>....................] - ETA: 2:47 - loss: 0.4036 - regression_loss: 0.3740 - classification_loss: 0.0296 168/500 [=========>....................] - ETA: 2:47 - loss: 0.4043 - regression_loss: 0.3744 - classification_loss: 0.0299 169/500 [=========>....................] - ETA: 2:46 - loss: 0.4042 - regression_loss: 0.3745 - classification_loss: 0.0297 170/500 [=========>....................] - ETA: 2:46 - loss: 0.4034 - regression_loss: 0.3738 - classification_loss: 0.0296 171/500 [=========>....................] - ETA: 2:45 - loss: 0.4033 - regression_loss: 0.3737 - classification_loss: 0.0296 172/500 [=========>....................] - ETA: 2:45 - loss: 0.4034 - regression_loss: 0.3739 - classification_loss: 0.0295 173/500 [=========>....................] - ETA: 2:44 - loss: 0.4036 - regression_loss: 0.3739 - classification_loss: 0.0297 174/500 [=========>....................] 
- ETA: 2:44 - loss: 0.4023 - regression_loss: 0.3727 - classification_loss: 0.0296 175/500 [=========>....................] - ETA: 2:43 - loss: 0.4013 - regression_loss: 0.3719 - classification_loss: 0.0294 176/500 [=========>....................] - ETA: 2:43 - loss: 0.4016 - regression_loss: 0.3721 - classification_loss: 0.0295 177/500 [=========>....................] - ETA: 2:42 - loss: 0.4010 - regression_loss: 0.3716 - classification_loss: 0.0294 178/500 [=========>....................] - ETA: 2:42 - loss: 0.4008 - regression_loss: 0.3715 - classification_loss: 0.0293 179/500 [=========>....................] - ETA: 2:41 - loss: 0.4004 - regression_loss: 0.3712 - classification_loss: 0.0292 180/500 [=========>....................] - ETA: 2:41 - loss: 0.4003 - regression_loss: 0.3711 - classification_loss: 0.0292 181/500 [=========>....................] - ETA: 2:40 - loss: 0.4005 - regression_loss: 0.3713 - classification_loss: 0.0292 182/500 [=========>....................] - ETA: 2:40 - loss: 0.4002 - regression_loss: 0.3711 - classification_loss: 0.0291 183/500 [=========>....................] - ETA: 2:39 - loss: 0.3991 - regression_loss: 0.3700 - classification_loss: 0.0291 184/500 [==========>...................] - ETA: 2:39 - loss: 0.3982 - regression_loss: 0.3692 - classification_loss: 0.0291 185/500 [==========>...................] - ETA: 2:38 - loss: 0.3971 - regression_loss: 0.3682 - classification_loss: 0.0289 186/500 [==========>...................] - ETA: 2:38 - loss: 0.3970 - regression_loss: 0.3680 - classification_loss: 0.0289 187/500 [==========>...................] - ETA: 2:37 - loss: 0.3965 - regression_loss: 0.3677 - classification_loss: 0.0288 188/500 [==========>...................] - ETA: 2:37 - loss: 0.3961 - regression_loss: 0.3672 - classification_loss: 0.0289 189/500 [==========>...................] - ETA: 2:36 - loss: 0.3958 - regression_loss: 0.3669 - classification_loss: 0.0288 190/500 [==========>...................] 
- ETA: 2:36 - loss: 0.3961 - regression_loss: 0.3672 - classification_loss: 0.0289 191/500 [==========>...................] - ETA: 2:35 - loss: 0.3955 - regression_loss: 0.3666 - classification_loss: 0.0288 192/500 [==========>...................] - ETA: 2:35 - loss: 0.3942 - regression_loss: 0.3654 - classification_loss: 0.0288 193/500 [==========>...................] - ETA: 2:34 - loss: 0.3945 - regression_loss: 0.3657 - classification_loss: 0.0287 194/500 [==========>...................] - ETA: 2:34 - loss: 0.3943 - regression_loss: 0.3656 - classification_loss: 0.0287 195/500 [==========>...................] - ETA: 2:33 - loss: 0.3947 - regression_loss: 0.3659 - classification_loss: 0.0288 196/500 [==========>...................] - ETA: 2:33 - loss: 0.3949 - regression_loss: 0.3661 - classification_loss: 0.0288 197/500 [==========>...................] - ETA: 2:32 - loss: 0.3961 - regression_loss: 0.3670 - classification_loss: 0.0291 198/500 [==========>...................] - ETA: 2:32 - loss: 0.3947 - regression_loss: 0.3657 - classification_loss: 0.0290 199/500 [==========>...................] - ETA: 2:31 - loss: 0.3940 - regression_loss: 0.3650 - classification_loss: 0.0290 200/500 [===========>..................] - ETA: 2:31 - loss: 0.3930 - regression_loss: 0.3641 - classification_loss: 0.0289 201/500 [===========>..................] - ETA: 2:30 - loss: 0.3921 - regression_loss: 0.3633 - classification_loss: 0.0288 202/500 [===========>..................] - ETA: 2:30 - loss: 0.3928 - regression_loss: 0.3639 - classification_loss: 0.0288 203/500 [===========>..................] - ETA: 2:29 - loss: 0.3915 - regression_loss: 0.3627 - classification_loss: 0.0287 204/500 [===========>..................] - ETA: 2:29 - loss: 0.3912 - regression_loss: 0.3625 - classification_loss: 0.0287 205/500 [===========>..................] - ETA: 2:28 - loss: 0.3923 - regression_loss: 0.3635 - classification_loss: 0.0288 206/500 [===========>..................] 
- ETA: 2:28 - loss: 0.3933 - regression_loss: 0.3642 - classification_loss: 0.0290 207/500 [===========>..................] - ETA: 2:27 - loss: 0.3923 - regression_loss: 0.3633 - classification_loss: 0.0289 208/500 [===========>..................] - ETA: 2:27 - loss: 0.3925 - regression_loss: 0.3636 - classification_loss: 0.0290 209/500 [===========>..................] - ETA: 2:26 - loss: 0.3925 - regression_loss: 0.3636 - classification_loss: 0.0289 210/500 [===========>..................] - ETA: 2:26 - loss: 0.3931 - regression_loss: 0.3642 - classification_loss: 0.0289 211/500 [===========>..................] - ETA: 2:25 - loss: 0.3933 - regression_loss: 0.3645 - classification_loss: 0.0288 212/500 [===========>..................] - ETA: 2:25 - loss: 0.3928 - regression_loss: 0.3640 - classification_loss: 0.0288 213/500 [===========>..................] - ETA: 2:24 - loss: 0.3927 - regression_loss: 0.3639 - classification_loss: 0.0288 214/500 [===========>..................] - ETA: 2:24 - loss: 0.3923 - regression_loss: 0.3635 - classification_loss: 0.0287 215/500 [===========>..................] - ETA: 2:23 - loss: 0.3911 - regression_loss: 0.3625 - classification_loss: 0.0286 216/500 [===========>..................] - ETA: 2:23 - loss: 0.3912 - regression_loss: 0.3624 - classification_loss: 0.0288 217/500 [============>.................] - ETA: 2:22 - loss: 0.3916 - regression_loss: 0.3628 - classification_loss: 0.0288 218/500 [============>.................] - ETA: 2:22 - loss: 0.3929 - regression_loss: 0.3638 - classification_loss: 0.0290 219/500 [============>.................] - ETA: 2:21 - loss: 0.3932 - regression_loss: 0.3642 - classification_loss: 0.0290 220/500 [============>.................] - ETA: 2:21 - loss: 0.3935 - regression_loss: 0.3645 - classification_loss: 0.0290 221/500 [============>.................] - ETA: 2:20 - loss: 0.3931 - regression_loss: 0.3642 - classification_loss: 0.0290 222/500 [============>.................] 
- ETA: 2:20 - loss: 0.3942 - regression_loss: 0.3652 - classification_loss: 0.0290 223/500 [============>.................] - ETA: 2:19 - loss: 0.3945 - regression_loss: 0.3655 - classification_loss: 0.0290 224/500 [============>.................] - ETA: 2:18 - loss: 0.3971 - regression_loss: 0.3676 - classification_loss: 0.0296 225/500 [============>.................] - ETA: 2:18 - loss: 0.3975 - regression_loss: 0.3679 - classification_loss: 0.0296 226/500 [============>.................] - ETA: 2:17 - loss: 0.3979 - regression_loss: 0.3683 - classification_loss: 0.0296 227/500 [============>.................] - ETA: 2:17 - loss: 0.3979 - regression_loss: 0.3681 - classification_loss: 0.0297 228/500 [============>.................] - ETA: 2:16 - loss: 0.3974 - regression_loss: 0.3677 - classification_loss: 0.0296 229/500 [============>.................] - ETA: 2:16 - loss: 0.3963 - regression_loss: 0.3667 - classification_loss: 0.0296 230/500 [============>.................] - ETA: 2:15 - loss: 0.3953 - regression_loss: 0.3658 - classification_loss: 0.0295 231/500 [============>.................] - ETA: 2:15 - loss: 0.3954 - regression_loss: 0.3659 - classification_loss: 0.0295 232/500 [============>.................] - ETA: 2:14 - loss: 0.3951 - regression_loss: 0.3657 - classification_loss: 0.0294 233/500 [============>.................] - ETA: 2:14 - loss: 0.3942 - regression_loss: 0.3649 - classification_loss: 0.0293 234/500 [=============>................] - ETA: 2:13 - loss: 0.3956 - regression_loss: 0.3660 - classification_loss: 0.0296 235/500 [=============>................] - ETA: 2:13 - loss: 0.3960 - regression_loss: 0.3665 - classification_loss: 0.0296 236/500 [=============>................] - ETA: 2:12 - loss: 0.3973 - regression_loss: 0.3677 - classification_loss: 0.0296 237/500 [=============>................] - ETA: 2:12 - loss: 0.3969 - regression_loss: 0.3673 - classification_loss: 0.0296 238/500 [=============>................] 
- ETA: 2:11 - loss: 0.3967 - regression_loss: 0.3671 - classification_loss: 0.0296 239/500 [=============>................] - ETA: 2:11 - loss: 0.3962 - regression_loss: 0.3667 - classification_loss: 0.0295 240/500 [=============>................] - ETA: 2:10 - loss: 0.3958 - regression_loss: 0.3664 - classification_loss: 0.0295 241/500 [=============>................] - ETA: 2:10 - loss: 0.3959 - regression_loss: 0.3664 - classification_loss: 0.0295 242/500 [=============>................] - ETA: 2:09 - loss: 0.3949 - regression_loss: 0.3655 - classification_loss: 0.0294 243/500 [=============>................] - ETA: 2:09 - loss: 0.3951 - regression_loss: 0.3657 - classification_loss: 0.0294 244/500 [=============>................] - ETA: 2:08 - loss: 0.3953 - regression_loss: 0.3659 - classification_loss: 0.0294 245/500 [=============>................] - ETA: 2:08 - loss: 0.3951 - regression_loss: 0.3656 - classification_loss: 0.0295 246/500 [=============>................] - ETA: 2:07 - loss: 0.3948 - regression_loss: 0.3653 - classification_loss: 0.0295 247/500 [=============>................] - ETA: 2:07 - loss: 0.3944 - regression_loss: 0.3649 - classification_loss: 0.0294 248/500 [=============>................] - ETA: 2:06 - loss: 0.3941 - regression_loss: 0.3647 - classification_loss: 0.0295 249/500 [=============>................] - ETA: 2:06 - loss: 0.3945 - regression_loss: 0.3650 - classification_loss: 0.0295 250/500 [==============>...............] - ETA: 2:05 - loss: 0.3937 - regression_loss: 0.3642 - classification_loss: 0.0294 251/500 [==============>...............] - ETA: 2:05 - loss: 0.3944 - regression_loss: 0.3649 - classification_loss: 0.0295 252/500 [==============>...............] - ETA: 2:04 - loss: 0.3953 - regression_loss: 0.3658 - classification_loss: 0.0295 253/500 [==============>...............] - ETA: 2:04 - loss: 0.3950 - regression_loss: 0.3656 - classification_loss: 0.0294 254/500 [==============>...............] 
- ETA: 2:03 - loss: 0.3951 - regression_loss: 0.3658 - classification_loss: 0.0294 255/500 [==============>...............] - ETA: 2:03 - loss: 0.3958 - regression_loss: 0.3664 - classification_loss: 0.0294 256/500 [==============>...............] - ETA: 2:02 - loss: 0.3957 - regression_loss: 0.3663 - classification_loss: 0.0294 257/500 [==============>...............] - ETA: 2:02 - loss: 0.3955 - regression_loss: 0.3660 - classification_loss: 0.0295 258/500 [==============>...............] - ETA: 2:01 - loss: 0.3951 - regression_loss: 0.3656 - classification_loss: 0.0295 259/500 [==============>...............] - ETA: 2:01 - loss: 0.3952 - regression_loss: 0.3657 - classification_loss: 0.0295 260/500 [==============>...............] - ETA: 2:00 - loss: 0.3945 - regression_loss: 0.3651 - classification_loss: 0.0294 261/500 [==============>...............] - ETA: 2:00 - loss: 0.3935 - regression_loss: 0.3641 - classification_loss: 0.0293 262/500 [==============>...............] - ETA: 1:59 - loss: 0.3925 - regression_loss: 0.3633 - classification_loss: 0.0292 263/500 [==============>...............] - ETA: 1:59 - loss: 0.3928 - regression_loss: 0.3635 - classification_loss: 0.0293 264/500 [==============>...............] - ETA: 1:58 - loss: 0.3924 - regression_loss: 0.3631 - classification_loss: 0.0293 265/500 [==============>...............] - ETA: 1:58 - loss: 0.3920 - regression_loss: 0.3627 - classification_loss: 0.0293 266/500 [==============>...............] - ETA: 1:57 - loss: 0.3922 - regression_loss: 0.3629 - classification_loss: 0.0293 267/500 [===============>..............] - ETA: 1:57 - loss: 0.3928 - regression_loss: 0.3634 - classification_loss: 0.0294 268/500 [===============>..............] - ETA: 1:56 - loss: 0.3928 - regression_loss: 0.3634 - classification_loss: 0.0294 269/500 [===============>..............] - ETA: 1:56 - loss: 0.3934 - regression_loss: 0.3640 - classification_loss: 0.0295 270/500 [===============>..............] 
- ETA: 1:55 - loss: 0.3930 - regression_loss: 0.3636 - classification_loss: 0.0294 271/500 [===============>..............] - ETA: 1:55 - loss: 0.3931 - regression_loss: 0.3637 - classification_loss: 0.0294 272/500 [===============>..............] - ETA: 1:54 - loss: 0.3935 - regression_loss: 0.3641 - classification_loss: 0.0294 273/500 [===============>..............] - ETA: 1:54 - loss: 0.3929 - regression_loss: 0.3636 - classification_loss: 0.0293 274/500 [===============>..............] - ETA: 1:53 - loss: 0.3923 - regression_loss: 0.3630 - classification_loss: 0.0293 275/500 [===============>..............] - ETA: 1:53 - loss: 0.3922 - regression_loss: 0.3630 - classification_loss: 0.0293 276/500 [===============>..............] - ETA: 1:52 - loss: 0.3921 - regression_loss: 0.3628 - classification_loss: 0.0293 277/500 [===============>..............] - ETA: 1:52 - loss: 0.3918 - regression_loss: 0.3626 - classification_loss: 0.0292 278/500 [===============>..............] - ETA: 1:51 - loss: 0.3924 - regression_loss: 0.3631 - classification_loss: 0.0293 279/500 [===============>..............] - ETA: 1:51 - loss: 0.3923 - regression_loss: 0.3630 - classification_loss: 0.0293 280/500 [===============>..............] - ETA: 1:50 - loss: 0.3925 - regression_loss: 0.3632 - classification_loss: 0.0293 281/500 [===============>..............] - ETA: 1:50 - loss: 0.3925 - regression_loss: 0.3632 - classification_loss: 0.0293 282/500 [===============>..............] - ETA: 1:49 - loss: 0.3922 - regression_loss: 0.3629 - classification_loss: 0.0293 283/500 [===============>..............] - ETA: 1:49 - loss: 0.3916 - regression_loss: 0.3624 - classification_loss: 0.0292 284/500 [================>.............] - ETA: 1:48 - loss: 0.3909 - regression_loss: 0.3618 - classification_loss: 0.0291 285/500 [================>.............] - ETA: 1:48 - loss: 0.3907 - regression_loss: 0.3617 - classification_loss: 0.0291 286/500 [================>.............] 
- ETA: 1:47 - loss: 0.3910 - regression_loss: 0.3619 - classification_loss: 0.0290 287/500 [================>.............] - ETA: 1:47 - loss: 0.3914 - regression_loss: 0.3624 - classification_loss: 0.0290 288/500 [================>.............] - ETA: 1:46 - loss: 0.3913 - regression_loss: 0.3623 - classification_loss: 0.0290 289/500 [================>.............] - ETA: 1:46 - loss: 0.3917 - regression_loss: 0.3627 - classification_loss: 0.0290 290/500 [================>.............] - ETA: 1:45 - loss: 0.3918 - regression_loss: 0.3629 - classification_loss: 0.0290 291/500 [================>.............] - ETA: 1:45 - loss: 0.3912 - regression_loss: 0.3623 - classification_loss: 0.0289 292/500 [================>.............] - ETA: 1:44 - loss: 0.3913 - regression_loss: 0.3625 - classification_loss: 0.0288 293/500 [================>.............] - ETA: 1:44 - loss: 0.3929 - regression_loss: 0.3639 - classification_loss: 0.0290 294/500 [================>.............] - ETA: 1:43 - loss: 0.3923 - regression_loss: 0.3633 - classification_loss: 0.0290 295/500 [================>.............] - ETA: 1:43 - loss: 0.3919 - regression_loss: 0.3630 - classification_loss: 0.0289 296/500 [================>.............] - ETA: 1:42 - loss: 0.3925 - regression_loss: 0.3635 - classification_loss: 0.0290 297/500 [================>.............] - ETA: 1:42 - loss: 0.3919 - regression_loss: 0.3630 - classification_loss: 0.0289 298/500 [================>.............] - ETA: 1:41 - loss: 0.3922 - regression_loss: 0.3633 - classification_loss: 0.0289 299/500 [================>.............] - ETA: 1:41 - loss: 0.3930 - regression_loss: 0.3640 - classification_loss: 0.0290 300/500 [=================>............] - ETA: 1:40 - loss: 0.3932 - regression_loss: 0.3642 - classification_loss: 0.0290 301/500 [=================>............] - ETA: 1:40 - loss: 0.3929 - regression_loss: 0.3639 - classification_loss: 0.0290 302/500 [=================>............] 
- ETA: 1:39 - loss: 0.3924 - regression_loss: 0.3634 - classification_loss: 0.0289 303/500 [=================>............] - ETA: 1:39 - loss: 0.3927 - regression_loss: 0.3636 - classification_loss: 0.0290 304/500 [=================>............] - ETA: 1:38 - loss: 0.3925 - regression_loss: 0.3635 - classification_loss: 0.0290 305/500 [=================>............] - ETA: 1:38 - loss: 0.3931 - regression_loss: 0.3639 - classification_loss: 0.0292 306/500 [=================>............] - ETA: 1:37 - loss: 0.3929 - regression_loss: 0.3637 - classification_loss: 0.0292 307/500 [=================>............] - ETA: 1:37 - loss: 0.3928 - regression_loss: 0.3636 - classification_loss: 0.0292 308/500 [=================>............] - ETA: 1:36 - loss: 0.3929 - regression_loss: 0.3638 - classification_loss: 0.0292 309/500 [=================>............] - ETA: 1:36 - loss: 0.3927 - regression_loss: 0.3635 - classification_loss: 0.0291 310/500 [=================>............] - ETA: 1:35 - loss: 0.3923 - regression_loss: 0.3632 - classification_loss: 0.0291 311/500 [=================>............] - ETA: 1:35 - loss: 0.3923 - regression_loss: 0.3632 - classification_loss: 0.0291 312/500 [=================>............] - ETA: 1:34 - loss: 0.3918 - regression_loss: 0.3627 - classification_loss: 0.0291 313/500 [=================>............] - ETA: 1:34 - loss: 0.3920 - regression_loss: 0.3629 - classification_loss: 0.0291 314/500 [=================>............] - ETA: 1:33 - loss: 0.3914 - regression_loss: 0.3623 - classification_loss: 0.0291 315/500 [=================>............] - ETA: 1:33 - loss: 0.3909 - regression_loss: 0.3619 - classification_loss: 0.0290 316/500 [=================>............] - ETA: 1:32 - loss: 0.3910 - regression_loss: 0.3620 - classification_loss: 0.0290 317/500 [==================>...........] - ETA: 1:32 - loss: 0.3915 - regression_loss: 0.3625 - classification_loss: 0.0290 318/500 [==================>...........] 
- ETA: 1:31 - loss: 0.3917 - regression_loss: 0.3627 - classification_loss: 0.0291 319/500 [==================>...........] - ETA: 1:31 - loss: 0.3913 - regression_loss: 0.3622 - classification_loss: 0.0291 320/500 [==================>...........] - ETA: 1:30 - loss: 0.3918 - regression_loss: 0.3627 - classification_loss: 0.0291 321/500 [==================>...........] - ETA: 1:30 - loss: 0.3922 - regression_loss: 0.3630 - classification_loss: 0.0291 322/500 [==================>...........] - ETA: 1:29 - loss: 0.3920 - regression_loss: 0.3629 - classification_loss: 0.0291 323/500 [==================>...........] - ETA: 1:29 - loss: 0.3915 - regression_loss: 0.3624 - classification_loss: 0.0291 324/500 [==================>...........] - ETA: 1:28 - loss: 0.3912 - regression_loss: 0.3622 - classification_loss: 0.0290 325/500 [==================>...........] - ETA: 1:28 - loss: 0.3917 - regression_loss: 0.3626 - classification_loss: 0.0290 326/500 [==================>...........] - ETA: 1:27 - loss: 0.3923 - regression_loss: 0.3631 - classification_loss: 0.0292 327/500 [==================>...........] - ETA: 1:27 - loss: 0.3927 - regression_loss: 0.3636 - classification_loss: 0.0292 328/500 [==================>...........] - ETA: 1:26 - loss: 0.3934 - regression_loss: 0.3641 - classification_loss: 0.0293 329/500 [==================>...........] - ETA: 1:26 - loss: 0.3940 - regression_loss: 0.3646 - classification_loss: 0.0294 330/500 [==================>...........] - ETA: 1:25 - loss: 0.3938 - regression_loss: 0.3645 - classification_loss: 0.0293 331/500 [==================>...........] - ETA: 1:25 - loss: 0.3940 - regression_loss: 0.3647 - classification_loss: 0.0293 332/500 [==================>...........] - ETA: 1:24 - loss: 0.3934 - regression_loss: 0.3641 - classification_loss: 0.0293 333/500 [==================>...........] - ETA: 1:24 - loss: 0.3930 - regression_loss: 0.3638 - classification_loss: 0.0293 334/500 [===================>..........] 
[... per-step progress for Epoch 10, steps 335–499 elided; loss declined from ~0.393 to ~0.380 ...]
500/500 [==============================] - 251s 502ms/step - loss: 0.3804 - regression_loss: 0.3525 - classification_loss: 0.0279
1172 instances of class plum with average precision: 0.7396
mAP: 0.7396
Epoch 00010: saving model to ./training/snapshots/resnet101_pascal_10.h5
Epoch 11/150
[... per-step progress for Epoch 11, steps 1–169 elided; loss fluctuated between ~0.34 and ~0.49 before settling around 0.378 ...]
- ETA: 2:47 - loss: 0.3807 - regression_loss: 0.3524 - classification_loss: 0.0282 170/500 [=========>....................] - ETA: 2:46 - loss: 0.3803 - regression_loss: 0.3521 - classification_loss: 0.0282 171/500 [=========>....................] - ETA: 2:46 - loss: 0.3803 - regression_loss: 0.3521 - classification_loss: 0.0282 172/500 [=========>....................] - ETA: 2:45 - loss: 0.3801 - regression_loss: 0.3520 - classification_loss: 0.0282 173/500 [=========>....................] - ETA: 2:45 - loss: 0.3807 - regression_loss: 0.3524 - classification_loss: 0.0282 174/500 [=========>....................] - ETA: 2:44 - loss: 0.3794 - regression_loss: 0.3513 - classification_loss: 0.0281 175/500 [=========>....................] - ETA: 2:43 - loss: 0.3792 - regression_loss: 0.3510 - classification_loss: 0.0281 176/500 [=========>....................] - ETA: 2:43 - loss: 0.3796 - regression_loss: 0.3514 - classification_loss: 0.0282 177/500 [=========>....................] - ETA: 2:42 - loss: 0.3797 - regression_loss: 0.3516 - classification_loss: 0.0281 178/500 [=========>....................] - ETA: 2:42 - loss: 0.3802 - regression_loss: 0.3519 - classification_loss: 0.0283 179/500 [=========>....................] - ETA: 2:41 - loss: 0.3794 - regression_loss: 0.3512 - classification_loss: 0.0282 180/500 [=========>....................] - ETA: 2:41 - loss: 0.3806 - regression_loss: 0.3521 - classification_loss: 0.0285 181/500 [=========>....................] - ETA: 2:41 - loss: 0.3814 - regression_loss: 0.3529 - classification_loss: 0.0285 182/500 [=========>....................] - ETA: 2:40 - loss: 0.3803 - regression_loss: 0.3519 - classification_loss: 0.0284 183/500 [=========>....................] - ETA: 2:39 - loss: 0.3801 - regression_loss: 0.3517 - classification_loss: 0.0284 184/500 [==========>...................] - ETA: 2:39 - loss: 0.3799 - regression_loss: 0.3516 - classification_loss: 0.0283 185/500 [==========>...................] 
- ETA: 2:38 - loss: 0.3786 - regression_loss: 0.3503 - classification_loss: 0.0282 186/500 [==========>...................] - ETA: 2:38 - loss: 0.3791 - regression_loss: 0.3509 - classification_loss: 0.0282 187/500 [==========>...................] - ETA: 2:37 - loss: 0.3790 - regression_loss: 0.3509 - classification_loss: 0.0281 188/500 [==========>...................] - ETA: 2:37 - loss: 0.3791 - regression_loss: 0.3510 - classification_loss: 0.0281 189/500 [==========>...................] - ETA: 2:36 - loss: 0.3804 - regression_loss: 0.3522 - classification_loss: 0.0282 190/500 [==========>...................] - ETA: 2:36 - loss: 0.3798 - regression_loss: 0.3516 - classification_loss: 0.0282 191/500 [==========>...................] - ETA: 2:35 - loss: 0.3798 - regression_loss: 0.3516 - classification_loss: 0.0282 192/500 [==========>...................] - ETA: 2:35 - loss: 0.3796 - regression_loss: 0.3514 - classification_loss: 0.0282 193/500 [==========>...................] - ETA: 2:34 - loss: 0.3798 - regression_loss: 0.3517 - classification_loss: 0.0281 194/500 [==========>...................] - ETA: 2:34 - loss: 0.3796 - regression_loss: 0.3515 - classification_loss: 0.0281 195/500 [==========>...................] - ETA: 2:33 - loss: 0.3804 - regression_loss: 0.3522 - classification_loss: 0.0282 196/500 [==========>...................] - ETA: 2:33 - loss: 0.3809 - regression_loss: 0.3527 - classification_loss: 0.0282 197/500 [==========>...................] - ETA: 2:32 - loss: 0.3814 - regression_loss: 0.3532 - classification_loss: 0.0282 198/500 [==========>...................] - ETA: 2:32 - loss: 0.3808 - regression_loss: 0.3525 - classification_loss: 0.0282 199/500 [==========>...................] - ETA: 2:31 - loss: 0.3802 - regression_loss: 0.3521 - classification_loss: 0.0281 200/500 [===========>..................] - ETA: 2:31 - loss: 0.3796 - regression_loss: 0.3515 - classification_loss: 0.0281 201/500 [===========>..................] 
- ETA: 2:30 - loss: 0.3793 - regression_loss: 0.3513 - classification_loss: 0.0280 202/500 [===========>..................] - ETA: 2:30 - loss: 0.3787 - regression_loss: 0.3507 - classification_loss: 0.0279 203/500 [===========>..................] - ETA: 2:29 - loss: 0.3782 - regression_loss: 0.3503 - classification_loss: 0.0279 204/500 [===========>..................] - ETA: 2:29 - loss: 0.3784 - regression_loss: 0.3506 - classification_loss: 0.0279 205/500 [===========>..................] - ETA: 2:28 - loss: 0.3793 - regression_loss: 0.3513 - classification_loss: 0.0280 206/500 [===========>..................] - ETA: 2:28 - loss: 0.3794 - regression_loss: 0.3514 - classification_loss: 0.0280 207/500 [===========>..................] - ETA: 2:27 - loss: 0.3800 - regression_loss: 0.3520 - classification_loss: 0.0280 208/500 [===========>..................] - ETA: 2:27 - loss: 0.3798 - regression_loss: 0.3518 - classification_loss: 0.0280 209/500 [===========>..................] - ETA: 2:26 - loss: 0.3809 - regression_loss: 0.3530 - classification_loss: 0.0279 210/500 [===========>..................] - ETA: 2:26 - loss: 0.3809 - regression_loss: 0.3530 - classification_loss: 0.0279 211/500 [===========>..................] - ETA: 2:25 - loss: 0.3798 - regression_loss: 0.3521 - classification_loss: 0.0278 212/500 [===========>..................] - ETA: 2:25 - loss: 0.3814 - regression_loss: 0.3536 - classification_loss: 0.0278 213/500 [===========>..................] - ETA: 2:24 - loss: 0.3814 - regression_loss: 0.3536 - classification_loss: 0.0278 214/500 [===========>..................] - ETA: 2:24 - loss: 0.3804 - regression_loss: 0.3526 - classification_loss: 0.0277 215/500 [===========>..................] - ETA: 2:23 - loss: 0.3796 - regression_loss: 0.3519 - classification_loss: 0.0277 216/500 [===========>..................] - ETA: 2:23 - loss: 0.3796 - regression_loss: 0.3520 - classification_loss: 0.0277 217/500 [============>.................] 
- ETA: 2:22 - loss: 0.3796 - regression_loss: 0.3519 - classification_loss: 0.0276 218/500 [============>.................] - ETA: 2:22 - loss: 0.3788 - regression_loss: 0.3512 - classification_loss: 0.0276 219/500 [============>.................] - ETA: 2:21 - loss: 0.3782 - regression_loss: 0.3507 - classification_loss: 0.0276 220/500 [============>.................] - ETA: 2:21 - loss: 0.3771 - regression_loss: 0.3496 - classification_loss: 0.0275 221/500 [============>.................] - ETA: 2:20 - loss: 0.3771 - regression_loss: 0.3496 - classification_loss: 0.0274 222/500 [============>.................] - ETA: 2:20 - loss: 0.3769 - regression_loss: 0.3495 - classification_loss: 0.0274 223/500 [============>.................] - ETA: 2:19 - loss: 0.3765 - regression_loss: 0.3491 - classification_loss: 0.0274 224/500 [============>.................] - ETA: 2:19 - loss: 0.3762 - regression_loss: 0.3489 - classification_loss: 0.0274 225/500 [============>.................] - ETA: 2:18 - loss: 0.3760 - regression_loss: 0.3487 - classification_loss: 0.0274 226/500 [============>.................] - ETA: 2:18 - loss: 0.3779 - regression_loss: 0.3503 - classification_loss: 0.0276 227/500 [============>.................] - ETA: 2:17 - loss: 0.3783 - regression_loss: 0.3507 - classification_loss: 0.0276 228/500 [============>.................] - ETA: 2:17 - loss: 0.3783 - regression_loss: 0.3507 - classification_loss: 0.0276 229/500 [============>.................] - ETA: 2:16 - loss: 0.3775 - regression_loss: 0.3500 - classification_loss: 0.0275 230/500 [============>.................] - ETA: 2:16 - loss: 0.3773 - regression_loss: 0.3498 - classification_loss: 0.0275 231/500 [============>.................] - ETA: 2:15 - loss: 0.3778 - regression_loss: 0.3501 - classification_loss: 0.0276 232/500 [============>.................] - ETA: 2:15 - loss: 0.3768 - regression_loss: 0.3492 - classification_loss: 0.0276 233/500 [============>.................] 
- ETA: 2:14 - loss: 0.3762 - regression_loss: 0.3487 - classification_loss: 0.0275 234/500 [=============>................] - ETA: 2:14 - loss: 0.3761 - regression_loss: 0.3486 - classification_loss: 0.0274 235/500 [=============>................] - ETA: 2:13 - loss: 0.3751 - regression_loss: 0.3477 - classification_loss: 0.0274 236/500 [=============>................] - ETA: 2:13 - loss: 0.3746 - regression_loss: 0.3473 - classification_loss: 0.0273 237/500 [=============>................] - ETA: 2:12 - loss: 0.3750 - regression_loss: 0.3477 - classification_loss: 0.0273 238/500 [=============>................] - ETA: 2:12 - loss: 0.3741 - regression_loss: 0.3469 - classification_loss: 0.0272 239/500 [=============>................] - ETA: 2:11 - loss: 0.3738 - regression_loss: 0.3467 - classification_loss: 0.0272 240/500 [=============>................] - ETA: 2:11 - loss: 0.3737 - regression_loss: 0.3466 - classification_loss: 0.0271 241/500 [=============>................] - ETA: 2:10 - loss: 0.3741 - regression_loss: 0.3469 - classification_loss: 0.0271 242/500 [=============>................] - ETA: 2:10 - loss: 0.3741 - regression_loss: 0.3469 - classification_loss: 0.0272 243/500 [=============>................] - ETA: 2:09 - loss: 0.3741 - regression_loss: 0.3468 - classification_loss: 0.0273 244/500 [=============>................] - ETA: 2:09 - loss: 0.3735 - regression_loss: 0.3463 - classification_loss: 0.0272 245/500 [=============>................] - ETA: 2:08 - loss: 0.3732 - regression_loss: 0.3460 - classification_loss: 0.0271 246/500 [=============>................] - ETA: 2:08 - loss: 0.3736 - regression_loss: 0.3465 - classification_loss: 0.0271 247/500 [=============>................] - ETA: 2:07 - loss: 0.3730 - regression_loss: 0.3460 - classification_loss: 0.0270 248/500 [=============>................] - ETA: 2:07 - loss: 0.3726 - regression_loss: 0.3457 - classification_loss: 0.0269 249/500 [=============>................] 
- ETA: 2:06 - loss: 0.3729 - regression_loss: 0.3460 - classification_loss: 0.0269 250/500 [==============>...............] - ETA: 2:06 - loss: 0.3724 - regression_loss: 0.3455 - classification_loss: 0.0268 251/500 [==============>...............] - ETA: 2:05 - loss: 0.3716 - regression_loss: 0.3449 - classification_loss: 0.0268 252/500 [==============>...............] - ETA: 2:05 - loss: 0.3722 - regression_loss: 0.3454 - classification_loss: 0.0268 253/500 [==============>...............] - ETA: 2:04 - loss: 0.3715 - regression_loss: 0.3448 - classification_loss: 0.0267 254/500 [==============>...............] - ETA: 2:04 - loss: 0.3714 - regression_loss: 0.3448 - classification_loss: 0.0266 255/500 [==============>...............] - ETA: 2:03 - loss: 0.3722 - regression_loss: 0.3455 - classification_loss: 0.0268 256/500 [==============>...............] - ETA: 2:03 - loss: 0.3722 - regression_loss: 0.3454 - classification_loss: 0.0267 257/500 [==============>...............] - ETA: 2:02 - loss: 0.3724 - regression_loss: 0.3457 - classification_loss: 0.0267 258/500 [==============>...............] - ETA: 2:02 - loss: 0.3723 - regression_loss: 0.3455 - classification_loss: 0.0267 259/500 [==============>...............] - ETA: 2:01 - loss: 0.3725 - regression_loss: 0.3458 - classification_loss: 0.0267 260/500 [==============>...............] - ETA: 2:01 - loss: 0.3720 - regression_loss: 0.3453 - classification_loss: 0.0267 261/500 [==============>...............] - ETA: 2:00 - loss: 0.3716 - regression_loss: 0.3450 - classification_loss: 0.0266 262/500 [==============>...............] - ETA: 2:00 - loss: 0.3719 - regression_loss: 0.3452 - classification_loss: 0.0266 263/500 [==============>...............] - ETA: 1:59 - loss: 0.3719 - regression_loss: 0.3452 - classification_loss: 0.0266 264/500 [==============>...............] - ETA: 1:59 - loss: 0.3734 - regression_loss: 0.3466 - classification_loss: 0.0268 265/500 [==============>...............] 
- ETA: 1:58 - loss: 0.3747 - regression_loss: 0.3479 - classification_loss: 0.0268 266/500 [==============>...............] - ETA: 1:58 - loss: 0.3751 - regression_loss: 0.3483 - classification_loss: 0.0268 267/500 [===============>..............] - ETA: 1:57 - loss: 0.3760 - regression_loss: 0.3491 - classification_loss: 0.0268 268/500 [===============>..............] - ETA: 1:57 - loss: 0.3765 - regression_loss: 0.3497 - classification_loss: 0.0268 269/500 [===============>..............] - ETA: 1:56 - loss: 0.3759 - regression_loss: 0.3492 - classification_loss: 0.0267 270/500 [===============>..............] - ETA: 1:56 - loss: 0.3761 - regression_loss: 0.3493 - classification_loss: 0.0267 271/500 [===============>..............] - ETA: 1:55 - loss: 0.3759 - regression_loss: 0.3493 - classification_loss: 0.0267 272/500 [===============>..............] - ETA: 1:55 - loss: 0.3766 - regression_loss: 0.3498 - classification_loss: 0.0268 273/500 [===============>..............] - ETA: 1:54 - loss: 0.3764 - regression_loss: 0.3497 - classification_loss: 0.0267 274/500 [===============>..............] - ETA: 1:54 - loss: 0.3772 - regression_loss: 0.3504 - classification_loss: 0.0268 275/500 [===============>..............] - ETA: 1:53 - loss: 0.3776 - regression_loss: 0.3508 - classification_loss: 0.0268 276/500 [===============>..............] - ETA: 1:52 - loss: 0.3773 - regression_loss: 0.3505 - classification_loss: 0.0268 277/500 [===============>..............] - ETA: 1:52 - loss: 0.3771 - regression_loss: 0.3503 - classification_loss: 0.0268 278/500 [===============>..............] - ETA: 1:51 - loss: 0.3769 - regression_loss: 0.3502 - classification_loss: 0.0267 279/500 [===============>..............] - ETA: 1:51 - loss: 0.3767 - regression_loss: 0.3500 - classification_loss: 0.0268 280/500 [===============>..............] - ETA: 1:50 - loss: 0.3763 - regression_loss: 0.3496 - classification_loss: 0.0267 281/500 [===============>..............] 
- ETA: 1:50 - loss: 0.3773 - regression_loss: 0.3505 - classification_loss: 0.0267 282/500 [===============>..............] - ETA: 1:49 - loss: 0.3778 - regression_loss: 0.3511 - classification_loss: 0.0267 283/500 [===============>..............] - ETA: 1:49 - loss: 0.3776 - regression_loss: 0.3509 - classification_loss: 0.0267 284/500 [================>.............] - ETA: 1:48 - loss: 0.3778 - regression_loss: 0.3512 - classification_loss: 0.0267 285/500 [================>.............] - ETA: 1:48 - loss: 0.3773 - regression_loss: 0.3506 - classification_loss: 0.0267 286/500 [================>.............] - ETA: 1:47 - loss: 0.3771 - regression_loss: 0.3505 - classification_loss: 0.0266 287/500 [================>.............] - ETA: 1:47 - loss: 0.3776 - regression_loss: 0.3509 - classification_loss: 0.0267 288/500 [================>.............] - ETA: 1:46 - loss: 0.3774 - regression_loss: 0.3507 - classification_loss: 0.0267 289/500 [================>.............] - ETA: 1:46 - loss: 0.3771 - regression_loss: 0.3504 - classification_loss: 0.0267 290/500 [================>.............] - ETA: 1:45 - loss: 0.3771 - regression_loss: 0.3504 - classification_loss: 0.0267 291/500 [================>.............] - ETA: 1:45 - loss: 0.3768 - regression_loss: 0.3502 - classification_loss: 0.0266 292/500 [================>.............] - ETA: 1:44 - loss: 0.3765 - regression_loss: 0.3500 - classification_loss: 0.0266 293/500 [================>.............] - ETA: 1:44 - loss: 0.3763 - regression_loss: 0.3498 - classification_loss: 0.0265 294/500 [================>.............] - ETA: 1:43 - loss: 0.3754 - regression_loss: 0.3489 - classification_loss: 0.0265 295/500 [================>.............] - ETA: 1:43 - loss: 0.3759 - regression_loss: 0.3493 - classification_loss: 0.0266 296/500 [================>.............] - ETA: 1:42 - loss: 0.3757 - regression_loss: 0.3491 - classification_loss: 0.0266 297/500 [================>.............] 
- ETA: 1:42 - loss: 0.3763 - regression_loss: 0.3497 - classification_loss: 0.0266 298/500 [================>.............] - ETA: 1:41 - loss: 0.3764 - regression_loss: 0.3498 - classification_loss: 0.0266 299/500 [================>.............] - ETA: 1:41 - loss: 0.3757 - regression_loss: 0.3492 - classification_loss: 0.0265 300/500 [=================>............] - ETA: 1:40 - loss: 0.3756 - regression_loss: 0.3492 - classification_loss: 0.0264 301/500 [=================>............] - ETA: 1:40 - loss: 0.3759 - regression_loss: 0.3495 - classification_loss: 0.0264 302/500 [=================>............] - ETA: 1:39 - loss: 0.3758 - regression_loss: 0.3494 - classification_loss: 0.0264 303/500 [=================>............] - ETA: 1:39 - loss: 0.3764 - regression_loss: 0.3499 - classification_loss: 0.0265 304/500 [=================>............] - ETA: 1:38 - loss: 0.3765 - regression_loss: 0.3499 - classification_loss: 0.0265 305/500 [=================>............] - ETA: 1:38 - loss: 0.3767 - regression_loss: 0.3501 - classification_loss: 0.0266 306/500 [=================>............] - ETA: 1:37 - loss: 0.3775 - regression_loss: 0.3509 - classification_loss: 0.0266 307/500 [=================>............] - ETA: 1:37 - loss: 0.3772 - regression_loss: 0.3506 - classification_loss: 0.0266 308/500 [=================>............] - ETA: 1:36 - loss: 0.3765 - regression_loss: 0.3500 - classification_loss: 0.0265 309/500 [=================>............] - ETA: 1:36 - loss: 0.3770 - regression_loss: 0.3505 - classification_loss: 0.0265 310/500 [=================>............] - ETA: 1:35 - loss: 0.3769 - regression_loss: 0.3505 - classification_loss: 0.0265 311/500 [=================>............] - ETA: 1:35 - loss: 0.3781 - regression_loss: 0.3515 - classification_loss: 0.0265 312/500 [=================>............] - ETA: 1:34 - loss: 0.3790 - regression_loss: 0.3523 - classification_loss: 0.0267 313/500 [=================>............] 
- ETA: 1:34 - loss: 0.3792 - regression_loss: 0.3525 - classification_loss: 0.0267 314/500 [=================>............] - ETA: 1:33 - loss: 0.3795 - regression_loss: 0.3528 - classification_loss: 0.0267 315/500 [=================>............] - ETA: 1:33 - loss: 0.3801 - regression_loss: 0.3534 - classification_loss: 0.0267 316/500 [=================>............] - ETA: 1:32 - loss: 0.3802 - regression_loss: 0.3535 - classification_loss: 0.0267 317/500 [==================>...........] - ETA: 1:32 - loss: 0.3805 - regression_loss: 0.3538 - classification_loss: 0.0267 318/500 [==================>...........] - ETA: 1:31 - loss: 0.3816 - regression_loss: 0.3548 - classification_loss: 0.0268 319/500 [==================>...........] - ETA: 1:31 - loss: 0.3811 - regression_loss: 0.3544 - classification_loss: 0.0267 320/500 [==================>...........] - ETA: 1:30 - loss: 0.3833 - regression_loss: 0.3563 - classification_loss: 0.0270 321/500 [==================>...........] - ETA: 1:30 - loss: 0.3840 - regression_loss: 0.3569 - classification_loss: 0.0271 322/500 [==================>...........] - ETA: 1:29 - loss: 0.3858 - regression_loss: 0.3585 - classification_loss: 0.0272 323/500 [==================>...........] - ETA: 1:29 - loss: 0.3852 - regression_loss: 0.3580 - classification_loss: 0.0272 324/500 [==================>...........] - ETA: 1:28 - loss: 0.3847 - regression_loss: 0.3576 - classification_loss: 0.0271 325/500 [==================>...........] - ETA: 1:28 - loss: 0.3851 - regression_loss: 0.3579 - classification_loss: 0.0271 326/500 [==================>...........] - ETA: 1:27 - loss: 0.3857 - regression_loss: 0.3586 - classification_loss: 0.0271 327/500 [==================>...........] - ETA: 1:27 - loss: 0.3858 - regression_loss: 0.3587 - classification_loss: 0.0271 328/500 [==================>...........] - ETA: 1:26 - loss: 0.3861 - regression_loss: 0.3590 - classification_loss: 0.0271 329/500 [==================>...........] 
- ETA: 1:26 - loss: 0.3859 - regression_loss: 0.3587 - classification_loss: 0.0271 330/500 [==================>...........] - ETA: 1:25 - loss: 0.3857 - regression_loss: 0.3586 - classification_loss: 0.0271 331/500 [==================>...........] - ETA: 1:25 - loss: 0.3857 - regression_loss: 0.3586 - classification_loss: 0.0271 332/500 [==================>...........] - ETA: 1:24 - loss: 0.3856 - regression_loss: 0.3586 - classification_loss: 0.0270 333/500 [==================>...........] - ETA: 1:24 - loss: 0.3853 - regression_loss: 0.3583 - classification_loss: 0.0270 334/500 [===================>..........] - ETA: 1:23 - loss: 0.3854 - regression_loss: 0.3584 - classification_loss: 0.0271 335/500 [===================>..........] - ETA: 1:23 - loss: 0.3859 - regression_loss: 0.3587 - classification_loss: 0.0272 336/500 [===================>..........] - ETA: 1:22 - loss: 0.3856 - regression_loss: 0.3585 - classification_loss: 0.0272 337/500 [===================>..........] - ETA: 1:22 - loss: 0.3855 - regression_loss: 0.3584 - classification_loss: 0.0271 338/500 [===================>..........] - ETA: 1:21 - loss: 0.3855 - regression_loss: 0.3584 - classification_loss: 0.0271 339/500 [===================>..........] - ETA: 1:21 - loss: 0.3856 - regression_loss: 0.3585 - classification_loss: 0.0271 340/500 [===================>..........] - ETA: 1:20 - loss: 0.3854 - regression_loss: 0.3583 - classification_loss: 0.0271 341/500 [===================>..........] - ETA: 1:20 - loss: 0.3849 - regression_loss: 0.3578 - classification_loss: 0.0270 342/500 [===================>..........] - ETA: 1:19 - loss: 0.3845 - regression_loss: 0.3575 - classification_loss: 0.0270 343/500 [===================>..........] - ETA: 1:19 - loss: 0.3845 - regression_loss: 0.3574 - classification_loss: 0.0271 344/500 [===================>..........] - ETA: 1:18 - loss: 0.3846 - regression_loss: 0.3576 - classification_loss: 0.0271 345/500 [===================>..........] 
- ETA: 1:18 - loss: 0.3856 - regression_loss: 0.3584 - classification_loss: 0.0272 346/500 [===================>..........] - ETA: 1:17 - loss: 0.3861 - regression_loss: 0.3589 - classification_loss: 0.0272 347/500 [===================>..........] - ETA: 1:17 - loss: 0.3863 - regression_loss: 0.3591 - classification_loss: 0.0272 348/500 [===================>..........] - ETA: 1:16 - loss: 0.3855 - regression_loss: 0.3583 - classification_loss: 0.0271 349/500 [===================>..........] - ETA: 1:16 - loss: 0.3853 - regression_loss: 0.3581 - classification_loss: 0.0271 350/500 [====================>.........] - ETA: 1:15 - loss: 0.3856 - regression_loss: 0.3584 - classification_loss: 0.0271 351/500 [====================>.........] - ETA: 1:15 - loss: 0.3860 - regression_loss: 0.3589 - classification_loss: 0.0271 352/500 [====================>.........] - ETA: 1:14 - loss: 0.3862 - regression_loss: 0.3591 - classification_loss: 0.0271 353/500 [====================>.........] - ETA: 1:14 - loss: 0.3862 - regression_loss: 0.3590 - classification_loss: 0.0272 354/500 [====================>.........] - ETA: 1:13 - loss: 0.3859 - regression_loss: 0.3587 - classification_loss: 0.0271 355/500 [====================>.........] - ETA: 1:13 - loss: 0.3864 - regression_loss: 0.3593 - classification_loss: 0.0272 356/500 [====================>.........] - ETA: 1:12 - loss: 0.3865 - regression_loss: 0.3594 - classification_loss: 0.0272 357/500 [====================>.........] - ETA: 1:12 - loss: 0.3865 - regression_loss: 0.3593 - classification_loss: 0.0272 358/500 [====================>.........] - ETA: 1:11 - loss: 0.3862 - regression_loss: 0.3590 - classification_loss: 0.0271 359/500 [====================>.........] - ETA: 1:11 - loss: 0.3856 - regression_loss: 0.3585 - classification_loss: 0.0271 360/500 [====================>.........] - ETA: 1:10 - loss: 0.3858 - regression_loss: 0.3587 - classification_loss: 0.0271 361/500 [====================>.........] 
- ETA: 1:10 - loss: 0.3860 - regression_loss: 0.3589 - classification_loss: 0.0271 362/500 [====================>.........] - ETA: 1:09 - loss: 0.3858 - regression_loss: 0.3588 - classification_loss: 0.0271 363/500 [====================>.........] - ETA: 1:09 - loss: 0.3858 - regression_loss: 0.3587 - classification_loss: 0.0271 364/500 [====================>.........] - ETA: 1:08 - loss: 0.3855 - regression_loss: 0.3585 - classification_loss: 0.0270 365/500 [====================>.........] - ETA: 1:08 - loss: 0.3851 - regression_loss: 0.3581 - classification_loss: 0.0270 366/500 [====================>.........] - ETA: 1:07 - loss: 0.3850 - regression_loss: 0.3579 - classification_loss: 0.0270 367/500 [=====================>........] - ETA: 1:07 - loss: 0.3846 - regression_loss: 0.3576 - classification_loss: 0.0270 368/500 [=====================>........] - ETA: 1:06 - loss: 0.3845 - regression_loss: 0.3575 - classification_loss: 0.0270 369/500 [=====================>........] - ETA: 1:06 - loss: 0.3842 - regression_loss: 0.3573 - classification_loss: 0.0269 370/500 [=====================>........] - ETA: 1:05 - loss: 0.3842 - regression_loss: 0.3572 - classification_loss: 0.0270 371/500 [=====================>........] - ETA: 1:05 - loss: 0.3836 - regression_loss: 0.3567 - classification_loss: 0.0269 372/500 [=====================>........] - ETA: 1:04 - loss: 0.3836 - regression_loss: 0.3567 - classification_loss: 0.0269 373/500 [=====================>........] - ETA: 1:04 - loss: 0.3833 - regression_loss: 0.3565 - classification_loss: 0.0269 374/500 [=====================>........] - ETA: 1:03 - loss: 0.3831 - regression_loss: 0.3563 - classification_loss: 0.0268 375/500 [=====================>........] - ETA: 1:03 - loss: 0.3827 - regression_loss: 0.3559 - classification_loss: 0.0268 376/500 [=====================>........] - ETA: 1:02 - loss: 0.3826 - regression_loss: 0.3558 - classification_loss: 0.0268 377/500 [=====================>........] 
[per-batch progress-bar output for steps 378-499 of epoch 11 elided]
500/500 [==============================] - 252s 504ms/step - loss: 0.3798 - regression_loss: 0.3537 - classification_loss: 0.0261
1172 instances of class plum with average precision: 0.7355
mAP: 0.7355
Epoch 00011: saving model to ./training/snapshots/resnet101_pascal_11.h5
Epoch 12/150
[per-batch progress-bar output for steps 1-211 of epoch 12 elided]
- ETA: 2:24 - loss: 0.3642 - regression_loss: 0.3402 - classification_loss: 0.0241 213/500 [===========>..................] - ETA: 2:23 - loss: 0.3635 - regression_loss: 0.3395 - classification_loss: 0.0240 214/500 [===========>..................] - ETA: 2:23 - loss: 0.3632 - regression_loss: 0.3392 - classification_loss: 0.0239 215/500 [===========>..................] - ETA: 2:23 - loss: 0.3629 - regression_loss: 0.3390 - classification_loss: 0.0239 216/500 [===========>..................] - ETA: 2:22 - loss: 0.3619 - regression_loss: 0.3381 - classification_loss: 0.0238 217/500 [============>.................] - ETA: 2:21 - loss: 0.3619 - regression_loss: 0.3381 - classification_loss: 0.0238 218/500 [============>.................] - ETA: 2:21 - loss: 0.3623 - regression_loss: 0.3385 - classification_loss: 0.0238 219/500 [============>.................] - ETA: 2:20 - loss: 0.3630 - regression_loss: 0.3393 - classification_loss: 0.0238 220/500 [============>.................] - ETA: 2:20 - loss: 0.3626 - regression_loss: 0.3388 - classification_loss: 0.0237 221/500 [============>.................] - ETA: 2:20 - loss: 0.3622 - regression_loss: 0.3386 - classification_loss: 0.0237 222/500 [============>.................] - ETA: 2:19 - loss: 0.3623 - regression_loss: 0.3387 - classification_loss: 0.0237 223/500 [============>.................] - ETA: 2:18 - loss: 0.3626 - regression_loss: 0.3389 - classification_loss: 0.0237 224/500 [============>.................] - ETA: 2:18 - loss: 0.3632 - regression_loss: 0.3394 - classification_loss: 0.0238 225/500 [============>.................] - ETA: 2:17 - loss: 0.3632 - regression_loss: 0.3394 - classification_loss: 0.0238 226/500 [============>.................] - ETA: 2:17 - loss: 0.3630 - regression_loss: 0.3393 - classification_loss: 0.0238 227/500 [============>.................] - ETA: 2:16 - loss: 0.3625 - regression_loss: 0.3387 - classification_loss: 0.0237 228/500 [============>.................] 
- ETA: 2:16 - loss: 0.3631 - regression_loss: 0.3393 - classification_loss: 0.0238 229/500 [============>.................] - ETA: 2:15 - loss: 0.3626 - regression_loss: 0.3388 - classification_loss: 0.0238 230/500 [============>.................] - ETA: 2:15 - loss: 0.3623 - regression_loss: 0.3385 - classification_loss: 0.0238 231/500 [============>.................] - ETA: 2:15 - loss: 0.3617 - regression_loss: 0.3380 - classification_loss: 0.0237 232/500 [============>.................] - ETA: 2:14 - loss: 0.3613 - regression_loss: 0.3376 - classification_loss: 0.0237 233/500 [============>.................] - ETA: 2:13 - loss: 0.3620 - regression_loss: 0.3383 - classification_loss: 0.0236 234/500 [=============>................] - ETA: 2:13 - loss: 0.3616 - regression_loss: 0.3381 - classification_loss: 0.0236 235/500 [=============>................] - ETA: 2:12 - loss: 0.3608 - regression_loss: 0.3373 - classification_loss: 0.0235 236/500 [=============>................] - ETA: 2:12 - loss: 0.3611 - regression_loss: 0.3375 - classification_loss: 0.0235 237/500 [=============>................] - ETA: 2:11 - loss: 0.3622 - regression_loss: 0.3386 - classification_loss: 0.0236 238/500 [=============>................] - ETA: 2:11 - loss: 0.3627 - regression_loss: 0.3390 - classification_loss: 0.0236 239/500 [=============>................] - ETA: 2:10 - loss: 0.3627 - regression_loss: 0.3390 - classification_loss: 0.0236 240/500 [=============>................] - ETA: 2:10 - loss: 0.3626 - regression_loss: 0.3390 - classification_loss: 0.0236 241/500 [=============>................] - ETA: 2:09 - loss: 0.3630 - regression_loss: 0.3394 - classification_loss: 0.0236 242/500 [=============>................] - ETA: 2:09 - loss: 0.3627 - regression_loss: 0.3391 - classification_loss: 0.0235 243/500 [=============>................] - ETA: 2:08 - loss: 0.3630 - regression_loss: 0.3395 - classification_loss: 0.0235 244/500 [=============>................] 
- ETA: 2:08 - loss: 0.3642 - regression_loss: 0.3404 - classification_loss: 0.0237 245/500 [=============>................] - ETA: 2:07 - loss: 0.3635 - regression_loss: 0.3398 - classification_loss: 0.0237 246/500 [=============>................] - ETA: 2:07 - loss: 0.3634 - regression_loss: 0.3397 - classification_loss: 0.0237 247/500 [=============>................] - ETA: 2:06 - loss: 0.3632 - regression_loss: 0.3395 - classification_loss: 0.0237 248/500 [=============>................] - ETA: 2:06 - loss: 0.3626 - regression_loss: 0.3390 - classification_loss: 0.0236 249/500 [=============>................] - ETA: 2:06 - loss: 0.3619 - regression_loss: 0.3382 - classification_loss: 0.0236 250/500 [==============>...............] - ETA: 2:05 - loss: 0.3612 - regression_loss: 0.3376 - classification_loss: 0.0236 251/500 [==============>...............] - ETA: 2:04 - loss: 0.3609 - regression_loss: 0.3374 - classification_loss: 0.0235 252/500 [==============>...............] - ETA: 2:04 - loss: 0.3613 - regression_loss: 0.3377 - classification_loss: 0.0236 253/500 [==============>...............] - ETA: 2:04 - loss: 0.3615 - regression_loss: 0.3379 - classification_loss: 0.0236 254/500 [==============>...............] - ETA: 2:03 - loss: 0.3623 - regression_loss: 0.3385 - classification_loss: 0.0238 255/500 [==============>...............] - ETA: 2:02 - loss: 0.3623 - regression_loss: 0.3385 - classification_loss: 0.0237 256/500 [==============>...............] - ETA: 2:02 - loss: 0.3627 - regression_loss: 0.3389 - classification_loss: 0.0238 257/500 [==============>...............] - ETA: 2:01 - loss: 0.3624 - regression_loss: 0.3387 - classification_loss: 0.0238 258/500 [==============>...............] - ETA: 2:01 - loss: 0.3624 - regression_loss: 0.3386 - classification_loss: 0.0238 259/500 [==============>...............] - ETA: 2:00 - loss: 0.3625 - regression_loss: 0.3387 - classification_loss: 0.0238 260/500 [==============>...............] 
- ETA: 2:00 - loss: 0.3627 - regression_loss: 0.3388 - classification_loss: 0.0238 261/500 [==============>...............] - ETA: 2:00 - loss: 0.3627 - regression_loss: 0.3388 - classification_loss: 0.0239 262/500 [==============>...............] - ETA: 1:59 - loss: 0.3631 - regression_loss: 0.3392 - classification_loss: 0.0239 263/500 [==============>...............] - ETA: 1:58 - loss: 0.3626 - regression_loss: 0.3387 - classification_loss: 0.0239 264/500 [==============>...............] - ETA: 1:58 - loss: 0.3631 - regression_loss: 0.3391 - classification_loss: 0.0240 265/500 [==============>...............] - ETA: 1:57 - loss: 0.3627 - regression_loss: 0.3388 - classification_loss: 0.0239 266/500 [==============>...............] - ETA: 1:57 - loss: 0.3618 - regression_loss: 0.3379 - classification_loss: 0.0239 267/500 [===============>..............] - ETA: 1:56 - loss: 0.3620 - regression_loss: 0.3381 - classification_loss: 0.0239 268/500 [===============>..............] - ETA: 1:56 - loss: 0.3616 - regression_loss: 0.3377 - classification_loss: 0.0239 269/500 [===============>..............] - ETA: 1:55 - loss: 0.3614 - regression_loss: 0.3375 - classification_loss: 0.0239 270/500 [===============>..............] - ETA: 1:55 - loss: 0.3609 - regression_loss: 0.3370 - classification_loss: 0.0239 271/500 [===============>..............] - ETA: 1:54 - loss: 0.3602 - regression_loss: 0.3364 - classification_loss: 0.0238 272/500 [===============>..............] - ETA: 1:54 - loss: 0.3606 - regression_loss: 0.3368 - classification_loss: 0.0239 273/500 [===============>..............] - ETA: 1:53 - loss: 0.3614 - regression_loss: 0.3374 - classification_loss: 0.0240 274/500 [===============>..............] - ETA: 1:53 - loss: 0.3611 - regression_loss: 0.3372 - classification_loss: 0.0239 275/500 [===============>..............] - ETA: 1:52 - loss: 0.3611 - regression_loss: 0.3372 - classification_loss: 0.0239 276/500 [===============>..............] 
- ETA: 1:52 - loss: 0.3615 - regression_loss: 0.3375 - classification_loss: 0.0241 277/500 [===============>..............] - ETA: 1:51 - loss: 0.3607 - regression_loss: 0.3367 - classification_loss: 0.0240 278/500 [===============>..............] - ETA: 1:51 - loss: 0.3605 - regression_loss: 0.3365 - classification_loss: 0.0240 279/500 [===============>..............] - ETA: 1:50 - loss: 0.3600 - regression_loss: 0.3361 - classification_loss: 0.0240 280/500 [===============>..............] - ETA: 1:50 - loss: 0.3603 - regression_loss: 0.3363 - classification_loss: 0.0239 281/500 [===============>..............] - ETA: 1:49 - loss: 0.3608 - regression_loss: 0.3369 - classification_loss: 0.0239 282/500 [===============>..............] - ETA: 1:49 - loss: 0.3607 - regression_loss: 0.3368 - classification_loss: 0.0239 283/500 [===============>..............] - ETA: 1:48 - loss: 0.3609 - regression_loss: 0.3370 - classification_loss: 0.0239 284/500 [================>.............] - ETA: 1:48 - loss: 0.3609 - regression_loss: 0.3369 - classification_loss: 0.0240 285/500 [================>.............] - ETA: 1:47 - loss: 0.3607 - regression_loss: 0.3368 - classification_loss: 0.0239 286/500 [================>.............] - ETA: 1:47 - loss: 0.3619 - regression_loss: 0.3379 - classification_loss: 0.0239 287/500 [================>.............] - ETA: 1:46 - loss: 0.3619 - regression_loss: 0.3379 - classification_loss: 0.0240 288/500 [================>.............] - ETA: 1:46 - loss: 0.3614 - regression_loss: 0.3375 - classification_loss: 0.0239 289/500 [================>.............] - ETA: 1:45 - loss: 0.3621 - regression_loss: 0.3383 - classification_loss: 0.0239 290/500 [================>.............] - ETA: 1:45 - loss: 0.3626 - regression_loss: 0.3387 - classification_loss: 0.0239 291/500 [================>.............] - ETA: 1:44 - loss: 0.3624 - regression_loss: 0.3385 - classification_loss: 0.0239 292/500 [================>.............] 
- ETA: 1:44 - loss: 0.3616 - regression_loss: 0.3378 - classification_loss: 0.0238 293/500 [================>.............] - ETA: 1:43 - loss: 0.3612 - regression_loss: 0.3374 - classification_loss: 0.0238 294/500 [================>.............] - ETA: 1:43 - loss: 0.3610 - regression_loss: 0.3372 - classification_loss: 0.0238 295/500 [================>.............] - ETA: 1:42 - loss: 0.3607 - regression_loss: 0.3369 - classification_loss: 0.0238 296/500 [================>.............] - ETA: 1:42 - loss: 0.3600 - regression_loss: 0.3362 - classification_loss: 0.0237 297/500 [================>.............] - ETA: 1:41 - loss: 0.3591 - regression_loss: 0.3354 - classification_loss: 0.0237 298/500 [================>.............] - ETA: 1:41 - loss: 0.3600 - regression_loss: 0.3363 - classification_loss: 0.0238 299/500 [================>.............] - ETA: 1:40 - loss: 0.3603 - regression_loss: 0.3365 - classification_loss: 0.0238 300/500 [=================>............] - ETA: 1:40 - loss: 0.3610 - regression_loss: 0.3371 - classification_loss: 0.0239 301/500 [=================>............] - ETA: 1:39 - loss: 0.3606 - regression_loss: 0.3367 - classification_loss: 0.0239 302/500 [=================>............] - ETA: 1:39 - loss: 0.3604 - regression_loss: 0.3366 - classification_loss: 0.0239 303/500 [=================>............] - ETA: 1:38 - loss: 0.3603 - regression_loss: 0.3364 - classification_loss: 0.0239 304/500 [=================>............] - ETA: 1:38 - loss: 0.3599 - regression_loss: 0.3361 - classification_loss: 0.0238 305/500 [=================>............] - ETA: 1:37 - loss: 0.3595 - regression_loss: 0.3357 - classification_loss: 0.0238 306/500 [=================>............] - ETA: 1:37 - loss: 0.3595 - regression_loss: 0.3358 - classification_loss: 0.0237 307/500 [=================>............] - ETA: 1:36 - loss: 0.3595 - regression_loss: 0.3358 - classification_loss: 0.0237 308/500 [=================>............] 
- ETA: 1:36 - loss: 0.3597 - regression_loss: 0.3360 - classification_loss: 0.0237 309/500 [=================>............] - ETA: 1:35 - loss: 0.3598 - regression_loss: 0.3361 - classification_loss: 0.0237 310/500 [=================>............] - ETA: 1:35 - loss: 0.3600 - regression_loss: 0.3363 - classification_loss: 0.0237 311/500 [=================>............] - ETA: 1:34 - loss: 0.3605 - regression_loss: 0.3368 - classification_loss: 0.0237 312/500 [=================>............] - ETA: 1:34 - loss: 0.3603 - regression_loss: 0.3366 - classification_loss: 0.0238 313/500 [=================>............] - ETA: 1:33 - loss: 0.3597 - regression_loss: 0.3360 - classification_loss: 0.0237 314/500 [=================>............] - ETA: 1:33 - loss: 0.3596 - regression_loss: 0.3358 - classification_loss: 0.0237 315/500 [=================>............] - ETA: 1:32 - loss: 0.3597 - regression_loss: 0.3360 - classification_loss: 0.0237 316/500 [=================>............] - ETA: 1:32 - loss: 0.3601 - regression_loss: 0.3364 - classification_loss: 0.0237 317/500 [==================>...........] - ETA: 1:31 - loss: 0.3593 - regression_loss: 0.3357 - classification_loss: 0.0236 318/500 [==================>...........] - ETA: 1:31 - loss: 0.3591 - regression_loss: 0.3355 - classification_loss: 0.0236 319/500 [==================>...........] - ETA: 1:30 - loss: 0.3594 - regression_loss: 0.3358 - classification_loss: 0.0236 320/500 [==================>...........] - ETA: 1:30 - loss: 0.3597 - regression_loss: 0.3361 - classification_loss: 0.0236 321/500 [==================>...........] - ETA: 1:29 - loss: 0.3600 - regression_loss: 0.3364 - classification_loss: 0.0236 322/500 [==================>...........] - ETA: 1:29 - loss: 0.3605 - regression_loss: 0.3369 - classification_loss: 0.0236 323/500 [==================>...........] - ETA: 1:28 - loss: 0.3606 - regression_loss: 0.3370 - classification_loss: 0.0236 324/500 [==================>...........] 
- ETA: 1:28 - loss: 0.3601 - regression_loss: 0.3365 - classification_loss: 0.0236 325/500 [==================>...........] - ETA: 1:27 - loss: 0.3601 - regression_loss: 0.3365 - classification_loss: 0.0236 326/500 [==================>...........] - ETA: 1:27 - loss: 0.3596 - regression_loss: 0.3361 - classification_loss: 0.0235 327/500 [==================>...........] - ETA: 1:26 - loss: 0.3595 - regression_loss: 0.3360 - classification_loss: 0.0236 328/500 [==================>...........] - ETA: 1:26 - loss: 0.3592 - regression_loss: 0.3357 - classification_loss: 0.0235 329/500 [==================>...........] - ETA: 1:25 - loss: 0.3592 - regression_loss: 0.3356 - classification_loss: 0.0235 330/500 [==================>...........] - ETA: 1:25 - loss: 0.3587 - regression_loss: 0.3353 - classification_loss: 0.0235 331/500 [==================>...........] - ETA: 1:24 - loss: 0.3587 - regression_loss: 0.3352 - classification_loss: 0.0235 332/500 [==================>...........] - ETA: 1:24 - loss: 0.3582 - regression_loss: 0.3348 - classification_loss: 0.0234 333/500 [==================>...........] - ETA: 1:23 - loss: 0.3580 - regression_loss: 0.3346 - classification_loss: 0.0234 334/500 [===================>..........] - ETA: 1:23 - loss: 0.3588 - regression_loss: 0.3354 - classification_loss: 0.0234 335/500 [===================>..........] - ETA: 1:22 - loss: 0.3592 - regression_loss: 0.3358 - classification_loss: 0.0234 336/500 [===================>..........] - ETA: 1:22 - loss: 0.3587 - regression_loss: 0.3354 - classification_loss: 0.0233 337/500 [===================>..........] - ETA: 1:21 - loss: 0.3590 - regression_loss: 0.3356 - classification_loss: 0.0234 338/500 [===================>..........] - ETA: 1:21 - loss: 0.3588 - regression_loss: 0.3355 - classification_loss: 0.0234 339/500 [===================>..........] - ETA: 1:20 - loss: 0.3586 - regression_loss: 0.3352 - classification_loss: 0.0233 340/500 [===================>..........] 
- ETA: 1:20 - loss: 0.3585 - regression_loss: 0.3352 - classification_loss: 0.0233 341/500 [===================>..........] - ETA: 1:19 - loss: 0.3588 - regression_loss: 0.3355 - classification_loss: 0.0233 342/500 [===================>..........] - ETA: 1:19 - loss: 0.3594 - regression_loss: 0.3360 - classification_loss: 0.0234 343/500 [===================>..........] - ETA: 1:18 - loss: 0.3592 - regression_loss: 0.3359 - classification_loss: 0.0233 344/500 [===================>..........] - ETA: 1:18 - loss: 0.3591 - regression_loss: 0.3358 - classification_loss: 0.0233 345/500 [===================>..........] - ETA: 1:17 - loss: 0.3591 - regression_loss: 0.3357 - classification_loss: 0.0233 346/500 [===================>..........] - ETA: 1:17 - loss: 0.3595 - regression_loss: 0.3362 - classification_loss: 0.0233 347/500 [===================>..........] - ETA: 1:16 - loss: 0.3595 - regression_loss: 0.3362 - classification_loss: 0.0233 348/500 [===================>..........] - ETA: 1:16 - loss: 0.3600 - regression_loss: 0.3367 - classification_loss: 0.0233 349/500 [===================>..........] - ETA: 1:15 - loss: 0.3609 - regression_loss: 0.3374 - classification_loss: 0.0235 350/500 [====================>.........] - ETA: 1:15 - loss: 0.3610 - regression_loss: 0.3374 - classification_loss: 0.0235 351/500 [====================>.........] - ETA: 1:14 - loss: 0.3608 - regression_loss: 0.3373 - classification_loss: 0.0235 352/500 [====================>.........] - ETA: 1:14 - loss: 0.3608 - regression_loss: 0.3373 - classification_loss: 0.0235 353/500 [====================>.........] - ETA: 1:13 - loss: 0.3608 - regression_loss: 0.3373 - classification_loss: 0.0235 354/500 [====================>.........] - ETA: 1:13 - loss: 0.3602 - regression_loss: 0.3367 - classification_loss: 0.0234 355/500 [====================>.........] - ETA: 1:12 - loss: 0.3601 - regression_loss: 0.3367 - classification_loss: 0.0234 356/500 [====================>.........] 
- ETA: 1:12 - loss: 0.3602 - regression_loss: 0.3369 - classification_loss: 0.0234 357/500 [====================>.........] - ETA: 1:11 - loss: 0.3602 - regression_loss: 0.3368 - classification_loss: 0.0233 358/500 [====================>.........] - ETA: 1:11 - loss: 0.3607 - regression_loss: 0.3371 - classification_loss: 0.0235 359/500 [====================>.........] - ETA: 1:10 - loss: 0.3613 - regression_loss: 0.3377 - classification_loss: 0.0236 360/500 [====================>.........] - ETA: 1:10 - loss: 0.3612 - regression_loss: 0.3376 - classification_loss: 0.0236 361/500 [====================>.........] - ETA: 1:09 - loss: 0.3613 - regression_loss: 0.3377 - classification_loss: 0.0236 362/500 [====================>.........] - ETA: 1:09 - loss: 0.3608 - regression_loss: 0.3372 - classification_loss: 0.0236 363/500 [====================>.........] - ETA: 1:08 - loss: 0.3608 - regression_loss: 0.3372 - classification_loss: 0.0236 364/500 [====================>.........] - ETA: 1:08 - loss: 0.3613 - regression_loss: 0.3377 - classification_loss: 0.0236 365/500 [====================>.........] - ETA: 1:07 - loss: 0.3610 - regression_loss: 0.3374 - classification_loss: 0.0236 366/500 [====================>.........] - ETA: 1:07 - loss: 0.3611 - regression_loss: 0.3374 - classification_loss: 0.0236 367/500 [=====================>........] - ETA: 1:06 - loss: 0.3612 - regression_loss: 0.3376 - classification_loss: 0.0236 368/500 [=====================>........] - ETA: 1:06 - loss: 0.3618 - regression_loss: 0.3382 - classification_loss: 0.0236 369/500 [=====================>........] - ETA: 1:05 - loss: 0.3612 - regression_loss: 0.3376 - classification_loss: 0.0236 370/500 [=====================>........] - ETA: 1:05 - loss: 0.3614 - regression_loss: 0.3378 - classification_loss: 0.0236 371/500 [=====================>........] - ETA: 1:04 - loss: 0.3613 - regression_loss: 0.3377 - classification_loss: 0.0236 372/500 [=====================>........] 
- ETA: 1:04 - loss: 0.3612 - regression_loss: 0.3377 - classification_loss: 0.0236 373/500 [=====================>........] - ETA: 1:03 - loss: 0.3607 - regression_loss: 0.3371 - classification_loss: 0.0235 374/500 [=====================>........] - ETA: 1:03 - loss: 0.3607 - regression_loss: 0.3372 - classification_loss: 0.0235 375/500 [=====================>........] - ETA: 1:02 - loss: 0.3607 - regression_loss: 0.3372 - classification_loss: 0.0235 376/500 [=====================>........] - ETA: 1:02 - loss: 0.3610 - regression_loss: 0.3374 - classification_loss: 0.0236 377/500 [=====================>........] - ETA: 1:01 - loss: 0.3611 - regression_loss: 0.3374 - classification_loss: 0.0236 378/500 [=====================>........] - ETA: 1:01 - loss: 0.3611 - regression_loss: 0.3375 - classification_loss: 0.0237 379/500 [=====================>........] - ETA: 1:00 - loss: 0.3606 - regression_loss: 0.3370 - classification_loss: 0.0236 380/500 [=====================>........] - ETA: 1:00 - loss: 0.3605 - regression_loss: 0.3369 - classification_loss: 0.0236 381/500 [=====================>........] - ETA: 59s - loss: 0.3614 - regression_loss: 0.3378 - classification_loss: 0.0236  382/500 [=====================>........] - ETA: 59s - loss: 0.3611 - regression_loss: 0.3375 - classification_loss: 0.0236 383/500 [=====================>........] - ETA: 58s - loss: 0.3614 - regression_loss: 0.3378 - classification_loss: 0.0236 384/500 [======================>.......] - ETA: 58s - loss: 0.3612 - regression_loss: 0.3377 - classification_loss: 0.0235 385/500 [======================>.......] - ETA: 57s - loss: 0.3610 - regression_loss: 0.3375 - classification_loss: 0.0235 386/500 [======================>.......] - ETA: 57s - loss: 0.3606 - regression_loss: 0.3371 - classification_loss: 0.0235 387/500 [======================>.......] - ETA: 56s - loss: 0.3604 - regression_loss: 0.3369 - classification_loss: 0.0235 388/500 [======================>.......] 
- ETA: 56s - loss: 0.3601 - regression_loss: 0.3366 - classification_loss: 0.0235 389/500 [======================>.......] - ETA: 55s - loss: 0.3599 - regression_loss: 0.3364 - classification_loss: 0.0235 390/500 [======================>.......] - ETA: 55s - loss: 0.3599 - regression_loss: 0.3365 - classification_loss: 0.0235 391/500 [======================>.......] - ETA: 54s - loss: 0.3602 - regression_loss: 0.3366 - classification_loss: 0.0235 392/500 [======================>.......] - ETA: 54s - loss: 0.3598 - regression_loss: 0.3364 - classification_loss: 0.0235 393/500 [======================>.......] - ETA: 53s - loss: 0.3596 - regression_loss: 0.3362 - classification_loss: 0.0235 394/500 [======================>.......] - ETA: 53s - loss: 0.3597 - regression_loss: 0.3362 - classification_loss: 0.0234 395/500 [======================>.......] - ETA: 52s - loss: 0.3596 - regression_loss: 0.3362 - classification_loss: 0.0234 396/500 [======================>.......] - ETA: 52s - loss: 0.3597 - regression_loss: 0.3362 - classification_loss: 0.0235 397/500 [======================>.......] - ETA: 51s - loss: 0.3601 - regression_loss: 0.3366 - classification_loss: 0.0235 398/500 [======================>.......] - ETA: 51s - loss: 0.3601 - regression_loss: 0.3366 - classification_loss: 0.0236 399/500 [======================>.......] - ETA: 50s - loss: 0.3599 - regression_loss: 0.3364 - classification_loss: 0.0235 400/500 [=======================>......] - ETA: 50s - loss: 0.3593 - regression_loss: 0.3358 - classification_loss: 0.0235 401/500 [=======================>......] - ETA: 49s - loss: 0.3592 - regression_loss: 0.3358 - classification_loss: 0.0234 402/500 [=======================>......] - ETA: 49s - loss: 0.3591 - regression_loss: 0.3357 - classification_loss: 0.0234 403/500 [=======================>......] - ETA: 48s - loss: 0.3596 - regression_loss: 0.3361 - classification_loss: 0.0235 404/500 [=======================>......] 
- ETA: 48s - loss: 0.3598 - regression_loss: 0.3363 - classification_loss: 0.0235 405/500 [=======================>......] - ETA: 47s - loss: 0.3592 - regression_loss: 0.3358 - classification_loss: 0.0234 406/500 [=======================>......] - ETA: 47s - loss: 0.3601 - regression_loss: 0.3366 - classification_loss: 0.0235 407/500 [=======================>......] - ETA: 46s - loss: 0.3599 - regression_loss: 0.3364 - classification_loss: 0.0235 408/500 [=======================>......] - ETA: 46s - loss: 0.3601 - regression_loss: 0.3366 - classification_loss: 0.0235 409/500 [=======================>......] - ETA: 45s - loss: 0.3596 - regression_loss: 0.3361 - classification_loss: 0.0235 410/500 [=======================>......] - ETA: 45s - loss: 0.3595 - regression_loss: 0.3361 - classification_loss: 0.0235 411/500 [=======================>......] - ETA: 44s - loss: 0.3603 - regression_loss: 0.3368 - classification_loss: 0.0235 412/500 [=======================>......] - ETA: 44s - loss: 0.3605 - regression_loss: 0.3369 - classification_loss: 0.0236 413/500 [=======================>......] - ETA: 43s - loss: 0.3602 - regression_loss: 0.3367 - classification_loss: 0.0235 414/500 [=======================>......] - ETA: 43s - loss: 0.3608 - regression_loss: 0.3372 - classification_loss: 0.0236 415/500 [=======================>......] - ETA: 42s - loss: 0.3610 - regression_loss: 0.3373 - classification_loss: 0.0237 416/500 [=======================>......] - ETA: 42s - loss: 0.3610 - regression_loss: 0.3374 - classification_loss: 0.0237 417/500 [========================>.....] - ETA: 41s - loss: 0.3608 - regression_loss: 0.3372 - classification_loss: 0.0237 418/500 [========================>.....] - ETA: 41s - loss: 0.3609 - regression_loss: 0.3373 - classification_loss: 0.0236 419/500 [========================>.....] - ETA: 40s - loss: 0.3610 - regression_loss: 0.3374 - classification_loss: 0.0237 420/500 [========================>.....] 
500/500 [==============================] - 251s 501ms/step - loss: 0.3600 - regression_loss: 0.3362 - classification_loss: 0.0237
1172 instances of class plum with average precision: 0.7187
mAP: 0.7187
Epoch 00012: saving model to ./training/snapshots/resnet101_pascal_12.h5
Epoch 13/150
253/500 [==============>...............] - ETA: 2:03 - loss: 0.3666 - regression_loss: 0.3421 - classification_loss: 0.0246 254/500 [==============>...............]
- ETA: 2:03 - loss: 0.3675 - regression_loss: 0.3428 - classification_loss: 0.0247 255/500 [==============>...............] - ETA: 2:02 - loss: 0.3669 - regression_loss: 0.3423 - classification_loss: 0.0246 256/500 [==============>...............] - ETA: 2:02 - loss: 0.3670 - regression_loss: 0.3424 - classification_loss: 0.0246 257/500 [==============>...............] - ETA: 2:01 - loss: 0.3667 - regression_loss: 0.3421 - classification_loss: 0.0245 258/500 [==============>...............] - ETA: 2:01 - loss: 0.3672 - regression_loss: 0.3425 - classification_loss: 0.0247 259/500 [==============>...............] - ETA: 2:00 - loss: 0.3676 - regression_loss: 0.3428 - classification_loss: 0.0248 260/500 [==============>...............] - ETA: 2:00 - loss: 0.3668 - regression_loss: 0.3421 - classification_loss: 0.0247 261/500 [==============>...............] - ETA: 1:59 - loss: 0.3668 - regression_loss: 0.3421 - classification_loss: 0.0247 262/500 [==============>...............] - ETA: 1:59 - loss: 0.3669 - regression_loss: 0.3421 - classification_loss: 0.0248 263/500 [==============>...............] - ETA: 1:58 - loss: 0.3668 - regression_loss: 0.3420 - classification_loss: 0.0248 264/500 [==============>...............] - ETA: 1:58 - loss: 0.3665 - regression_loss: 0.3417 - classification_loss: 0.0247 265/500 [==============>...............] - ETA: 1:57 - loss: 0.3656 - regression_loss: 0.3409 - classification_loss: 0.0247 266/500 [==============>...............] - ETA: 1:57 - loss: 0.3654 - regression_loss: 0.3408 - classification_loss: 0.0246 267/500 [===============>..............] - ETA: 1:56 - loss: 0.3656 - regression_loss: 0.3410 - classification_loss: 0.0246 268/500 [===============>..............] - ETA: 1:56 - loss: 0.3656 - regression_loss: 0.3411 - classification_loss: 0.0245 269/500 [===============>..............] - ETA: 1:55 - loss: 0.3664 - regression_loss: 0.3418 - classification_loss: 0.0247 270/500 [===============>..............] 
- ETA: 1:55 - loss: 0.3660 - regression_loss: 0.3414 - classification_loss: 0.0246 271/500 [===============>..............] - ETA: 1:54 - loss: 0.3657 - regression_loss: 0.3411 - classification_loss: 0.0246 272/500 [===============>..............] - ETA: 1:54 - loss: 0.3651 - regression_loss: 0.3405 - classification_loss: 0.0246 273/500 [===============>..............] - ETA: 1:53 - loss: 0.3650 - regression_loss: 0.3405 - classification_loss: 0.0245 274/500 [===============>..............] - ETA: 1:53 - loss: 0.3645 - regression_loss: 0.3400 - classification_loss: 0.0245 275/500 [===============>..............] - ETA: 1:52 - loss: 0.3642 - regression_loss: 0.3397 - classification_loss: 0.0245 276/500 [===============>..............] - ETA: 1:52 - loss: 0.3645 - regression_loss: 0.3400 - classification_loss: 0.0245 277/500 [===============>..............] - ETA: 1:51 - loss: 0.3642 - regression_loss: 0.3397 - classification_loss: 0.0244 278/500 [===============>..............] - ETA: 1:51 - loss: 0.3636 - regression_loss: 0.3392 - classification_loss: 0.0244 279/500 [===============>..............] - ETA: 1:50 - loss: 0.3631 - regression_loss: 0.3388 - classification_loss: 0.0243 280/500 [===============>..............] - ETA: 1:50 - loss: 0.3635 - regression_loss: 0.3391 - classification_loss: 0.0245 281/500 [===============>..............] - ETA: 1:49 - loss: 0.3638 - regression_loss: 0.3393 - classification_loss: 0.0244 282/500 [===============>..............] - ETA: 1:49 - loss: 0.3637 - regression_loss: 0.3393 - classification_loss: 0.0244 283/500 [===============>..............] - ETA: 1:48 - loss: 0.3637 - regression_loss: 0.3393 - classification_loss: 0.0244 284/500 [================>.............] - ETA: 1:48 - loss: 0.3635 - regression_loss: 0.3391 - classification_loss: 0.0244 285/500 [================>.............] - ETA: 1:47 - loss: 0.3634 - regression_loss: 0.3390 - classification_loss: 0.0244 286/500 [================>.............] 
- ETA: 1:47 - loss: 0.3631 - regression_loss: 0.3387 - classification_loss: 0.0244 287/500 [================>.............] - ETA: 1:46 - loss: 0.3641 - regression_loss: 0.3396 - classification_loss: 0.0244 288/500 [================>.............] - ETA: 1:46 - loss: 0.3635 - regression_loss: 0.3392 - classification_loss: 0.0244 289/500 [================>.............] - ETA: 1:45 - loss: 0.3640 - regression_loss: 0.3397 - classification_loss: 0.0244 290/500 [================>.............] - ETA: 1:45 - loss: 0.3651 - regression_loss: 0.3407 - classification_loss: 0.0244 291/500 [================>.............] - ETA: 1:44 - loss: 0.3644 - regression_loss: 0.3400 - classification_loss: 0.0244 292/500 [================>.............] - ETA: 1:44 - loss: 0.3639 - regression_loss: 0.3396 - classification_loss: 0.0243 293/500 [================>.............] - ETA: 1:43 - loss: 0.3642 - regression_loss: 0.3399 - classification_loss: 0.0243 294/500 [================>.............] - ETA: 1:43 - loss: 0.3645 - regression_loss: 0.3402 - classification_loss: 0.0243 295/500 [================>.............] - ETA: 1:42 - loss: 0.3658 - regression_loss: 0.3415 - classification_loss: 0.0243 296/500 [================>.............] - ETA: 1:42 - loss: 0.3650 - regression_loss: 0.3408 - classification_loss: 0.0242 297/500 [================>.............] - ETA: 1:41 - loss: 0.3668 - regression_loss: 0.3425 - classification_loss: 0.0244 298/500 [================>.............] - ETA: 1:41 - loss: 0.3663 - regression_loss: 0.3420 - classification_loss: 0.0243 299/500 [================>.............] - ETA: 1:40 - loss: 0.3668 - regression_loss: 0.3424 - classification_loss: 0.0244 300/500 [=================>............] - ETA: 1:40 - loss: 0.3664 - regression_loss: 0.3420 - classification_loss: 0.0243 301/500 [=================>............] - ETA: 1:39 - loss: 0.3664 - regression_loss: 0.3420 - classification_loss: 0.0243 302/500 [=================>............] 
- ETA: 1:39 - loss: 0.3660 - regression_loss: 0.3417 - classification_loss: 0.0243 303/500 [=================>............] - ETA: 1:38 - loss: 0.3661 - regression_loss: 0.3418 - classification_loss: 0.0243 304/500 [=================>............] - ETA: 1:38 - loss: 0.3660 - regression_loss: 0.3417 - classification_loss: 0.0243 305/500 [=================>............] - ETA: 1:37 - loss: 0.3662 - regression_loss: 0.3419 - classification_loss: 0.0243 306/500 [=================>............] - ETA: 1:37 - loss: 0.3658 - regression_loss: 0.3415 - classification_loss: 0.0243 307/500 [=================>............] - ETA: 1:36 - loss: 0.3655 - regression_loss: 0.3412 - classification_loss: 0.0243 308/500 [=================>............] - ETA: 1:36 - loss: 0.3652 - regression_loss: 0.3410 - classification_loss: 0.0242 309/500 [=================>............] - ETA: 1:35 - loss: 0.3653 - regression_loss: 0.3411 - classification_loss: 0.0242 310/500 [=================>............] - ETA: 1:35 - loss: 0.3661 - regression_loss: 0.3419 - classification_loss: 0.0242 311/500 [=================>............] - ETA: 1:34 - loss: 0.3657 - regression_loss: 0.3416 - classification_loss: 0.0241 312/500 [=================>............] - ETA: 1:34 - loss: 0.3663 - regression_loss: 0.3421 - classification_loss: 0.0242 313/500 [=================>............] - ETA: 1:33 - loss: 0.3665 - regression_loss: 0.3423 - classification_loss: 0.0242 314/500 [=================>............] - ETA: 1:33 - loss: 0.3672 - regression_loss: 0.3429 - classification_loss: 0.0243 315/500 [=================>............] - ETA: 1:32 - loss: 0.3670 - regression_loss: 0.3428 - classification_loss: 0.0243 316/500 [=================>............] - ETA: 1:32 - loss: 0.3672 - regression_loss: 0.3429 - classification_loss: 0.0243 317/500 [==================>...........] - ETA: 1:31 - loss: 0.3672 - regression_loss: 0.3429 - classification_loss: 0.0243 318/500 [==================>...........] 
- ETA: 1:31 - loss: 0.3671 - regression_loss: 0.3427 - classification_loss: 0.0243 319/500 [==================>...........] - ETA: 1:30 - loss: 0.3673 - regression_loss: 0.3430 - classification_loss: 0.0244 320/500 [==================>...........] - ETA: 1:30 - loss: 0.3669 - regression_loss: 0.3426 - classification_loss: 0.0243 321/500 [==================>...........] - ETA: 1:29 - loss: 0.3680 - regression_loss: 0.3436 - classification_loss: 0.0243 322/500 [==================>...........] - ETA: 1:29 - loss: 0.3686 - regression_loss: 0.3441 - classification_loss: 0.0245 323/500 [==================>...........] - ETA: 1:28 - loss: 0.3681 - regression_loss: 0.3436 - classification_loss: 0.0245 324/500 [==================>...........] - ETA: 1:28 - loss: 0.3679 - regression_loss: 0.3434 - classification_loss: 0.0244 325/500 [==================>...........] - ETA: 1:27 - loss: 0.3680 - regression_loss: 0.3435 - classification_loss: 0.0245 326/500 [==================>...........] - ETA: 1:27 - loss: 0.3671 - regression_loss: 0.3427 - classification_loss: 0.0244 327/500 [==================>...........] - ETA: 1:26 - loss: 0.3675 - regression_loss: 0.3430 - classification_loss: 0.0245 328/500 [==================>...........] - ETA: 1:26 - loss: 0.3670 - regression_loss: 0.3425 - classification_loss: 0.0245 329/500 [==================>...........] - ETA: 1:25 - loss: 0.3666 - regression_loss: 0.3422 - classification_loss: 0.0244 330/500 [==================>...........] - ETA: 1:25 - loss: 0.3668 - regression_loss: 0.3423 - classification_loss: 0.0244 331/500 [==================>...........] - ETA: 1:24 - loss: 0.3672 - regression_loss: 0.3427 - classification_loss: 0.0245 332/500 [==================>...........] - ETA: 1:24 - loss: 0.3670 - regression_loss: 0.3425 - classification_loss: 0.0245 333/500 [==================>...........] - ETA: 1:23 - loss: 0.3666 - regression_loss: 0.3422 - classification_loss: 0.0244 334/500 [===================>..........] 
- ETA: 1:23 - loss: 0.3663 - regression_loss: 0.3419 - classification_loss: 0.0244 335/500 [===================>..........] - ETA: 1:22 - loss: 0.3665 - regression_loss: 0.3421 - classification_loss: 0.0244 336/500 [===================>..........] - ETA: 1:22 - loss: 0.3663 - regression_loss: 0.3419 - classification_loss: 0.0244 337/500 [===================>..........] - ETA: 1:21 - loss: 0.3657 - regression_loss: 0.3414 - classification_loss: 0.0244 338/500 [===================>..........] - ETA: 1:21 - loss: 0.3658 - regression_loss: 0.3415 - classification_loss: 0.0244 339/500 [===================>..........] - ETA: 1:20 - loss: 0.3657 - regression_loss: 0.3414 - classification_loss: 0.0244 340/500 [===================>..........] - ETA: 1:20 - loss: 0.3655 - regression_loss: 0.3412 - classification_loss: 0.0243 341/500 [===================>..........] - ETA: 1:19 - loss: 0.3659 - regression_loss: 0.3416 - classification_loss: 0.0243 342/500 [===================>..........] - ETA: 1:19 - loss: 0.3658 - regression_loss: 0.3415 - classification_loss: 0.0243 343/500 [===================>..........] - ETA: 1:18 - loss: 0.3656 - regression_loss: 0.3413 - classification_loss: 0.0243 344/500 [===================>..........] - ETA: 1:18 - loss: 0.3658 - regression_loss: 0.3415 - classification_loss: 0.0243 345/500 [===================>..........] - ETA: 1:17 - loss: 0.3656 - regression_loss: 0.3413 - classification_loss: 0.0243 346/500 [===================>..........] - ETA: 1:17 - loss: 0.3650 - regression_loss: 0.3408 - classification_loss: 0.0242 347/500 [===================>..........] - ETA: 1:16 - loss: 0.3655 - regression_loss: 0.3412 - classification_loss: 0.0243 348/500 [===================>..........] - ETA: 1:16 - loss: 0.3659 - regression_loss: 0.3416 - classification_loss: 0.0243 349/500 [===================>..........] - ETA: 1:15 - loss: 0.3657 - regression_loss: 0.3415 - classification_loss: 0.0242 350/500 [====================>.........] 
- ETA: 1:15 - loss: 0.3659 - regression_loss: 0.3416 - classification_loss: 0.0243 351/500 [====================>.........] - ETA: 1:14 - loss: 0.3663 - regression_loss: 0.3420 - classification_loss: 0.0243 352/500 [====================>.........] - ETA: 1:14 - loss: 0.3661 - regression_loss: 0.3418 - classification_loss: 0.0243 353/500 [====================>.........] - ETA: 1:13 - loss: 0.3658 - regression_loss: 0.3416 - classification_loss: 0.0242 354/500 [====================>.........] - ETA: 1:13 - loss: 0.3657 - regression_loss: 0.3415 - classification_loss: 0.0242 355/500 [====================>.........] - ETA: 1:12 - loss: 0.3657 - regression_loss: 0.3415 - classification_loss: 0.0242 356/500 [====================>.........] - ETA: 1:12 - loss: 0.3655 - regression_loss: 0.3413 - classification_loss: 0.0243 357/500 [====================>.........] - ETA: 1:11 - loss: 0.3652 - regression_loss: 0.3409 - classification_loss: 0.0243 358/500 [====================>.........] - ETA: 1:11 - loss: 0.3648 - regression_loss: 0.3405 - classification_loss: 0.0243 359/500 [====================>.........] - ETA: 1:10 - loss: 0.3652 - regression_loss: 0.3409 - classification_loss: 0.0243 360/500 [====================>.........] - ETA: 1:10 - loss: 0.3653 - regression_loss: 0.3410 - classification_loss: 0.0242 361/500 [====================>.........] - ETA: 1:09 - loss: 0.3651 - regression_loss: 0.3409 - classification_loss: 0.0242 362/500 [====================>.........] - ETA: 1:09 - loss: 0.3659 - regression_loss: 0.3415 - classification_loss: 0.0243 363/500 [====================>.........] - ETA: 1:08 - loss: 0.3659 - regression_loss: 0.3416 - classification_loss: 0.0243 364/500 [====================>.........] - ETA: 1:08 - loss: 0.3659 - regression_loss: 0.3416 - classification_loss: 0.0243 365/500 [====================>.........] - ETA: 1:07 - loss: 0.3652 - regression_loss: 0.3409 - classification_loss: 0.0243 366/500 [====================>.........] 
- ETA: 1:07 - loss: 0.3653 - regression_loss: 0.3410 - classification_loss: 0.0243 367/500 [=====================>........] - ETA: 1:06 - loss: 0.3652 - regression_loss: 0.3409 - classification_loss: 0.0243 368/500 [=====================>........] - ETA: 1:06 - loss: 0.3655 - regression_loss: 0.3412 - classification_loss: 0.0243 369/500 [=====================>........] - ETA: 1:05 - loss: 0.3653 - regression_loss: 0.3410 - classification_loss: 0.0243 370/500 [=====================>........] - ETA: 1:05 - loss: 0.3650 - regression_loss: 0.3407 - classification_loss: 0.0243 371/500 [=====================>........] - ETA: 1:04 - loss: 0.3652 - regression_loss: 0.3410 - classification_loss: 0.0243 372/500 [=====================>........] - ETA: 1:04 - loss: 0.3655 - regression_loss: 0.3412 - classification_loss: 0.0242 373/500 [=====================>........] - ETA: 1:03 - loss: 0.3658 - regression_loss: 0.3415 - classification_loss: 0.0243 374/500 [=====================>........] - ETA: 1:03 - loss: 0.3658 - regression_loss: 0.3415 - classification_loss: 0.0243 375/500 [=====================>........] - ETA: 1:02 - loss: 0.3652 - regression_loss: 0.3409 - classification_loss: 0.0242 376/500 [=====================>........] - ETA: 1:02 - loss: 0.3652 - regression_loss: 0.3409 - classification_loss: 0.0242 377/500 [=====================>........] - ETA: 1:01 - loss: 0.3646 - regression_loss: 0.3404 - classification_loss: 0.0242 378/500 [=====================>........] - ETA: 1:01 - loss: 0.3642 - regression_loss: 0.3400 - classification_loss: 0.0241 379/500 [=====================>........] - ETA: 1:00 - loss: 0.3643 - regression_loss: 0.3401 - classification_loss: 0.0242 380/500 [=====================>........] - ETA: 1:00 - loss: 0.3639 - regression_loss: 0.3397 - classification_loss: 0.0241 381/500 [=====================>........] - ETA: 59s - loss: 0.3642 - regression_loss: 0.3400 - classification_loss: 0.0242  382/500 [=====================>........] 
- ETA: 59s - loss: 0.3645 - regression_loss: 0.3402 - classification_loss: 0.0243 383/500 [=====================>........] - ETA: 58s - loss: 0.3644 - regression_loss: 0.3400 - classification_loss: 0.0243 384/500 [======================>.......] - ETA: 58s - loss: 0.3646 - regression_loss: 0.3403 - classification_loss: 0.0243 385/500 [======================>.......] - ETA: 57s - loss: 0.3644 - regression_loss: 0.3401 - classification_loss: 0.0243 386/500 [======================>.......] - ETA: 57s - loss: 0.3639 - regression_loss: 0.3397 - classification_loss: 0.0243 387/500 [======================>.......] - ETA: 56s - loss: 0.3636 - regression_loss: 0.3393 - classification_loss: 0.0243 388/500 [======================>.......] - ETA: 56s - loss: 0.3636 - regression_loss: 0.3393 - classification_loss: 0.0242 389/500 [======================>.......] - ETA: 55s - loss: 0.3632 - regression_loss: 0.3390 - classification_loss: 0.0242 390/500 [======================>.......] - ETA: 55s - loss: 0.3627 - regression_loss: 0.3386 - classification_loss: 0.0241 391/500 [======================>.......] - ETA: 54s - loss: 0.3624 - regression_loss: 0.3383 - classification_loss: 0.0241 392/500 [======================>.......] - ETA: 54s - loss: 0.3621 - regression_loss: 0.3380 - classification_loss: 0.0241 393/500 [======================>.......] - ETA: 53s - loss: 0.3621 - regression_loss: 0.3380 - classification_loss: 0.0241 394/500 [======================>.......] - ETA: 53s - loss: 0.3625 - regression_loss: 0.3383 - classification_loss: 0.0242 395/500 [======================>.......] - ETA: 52s - loss: 0.3625 - regression_loss: 0.3383 - classification_loss: 0.0242 396/500 [======================>.......] - ETA: 52s - loss: 0.3634 - regression_loss: 0.3392 - classification_loss: 0.0242 397/500 [======================>.......] - ETA: 51s - loss: 0.3633 - regression_loss: 0.3391 - classification_loss: 0.0242 398/500 [======================>.......] 
- ETA: 51s - loss: 0.3640 - regression_loss: 0.3398 - classification_loss: 0.0242 399/500 [======================>.......] - ETA: 50s - loss: 0.3649 - regression_loss: 0.3406 - classification_loss: 0.0243 400/500 [=======================>......] - ETA: 50s - loss: 0.3649 - regression_loss: 0.3406 - classification_loss: 0.0243 401/500 [=======================>......] - ETA: 49s - loss: 0.3651 - regression_loss: 0.3409 - classification_loss: 0.0242 402/500 [=======================>......] - ETA: 49s - loss: 0.3653 - regression_loss: 0.3410 - classification_loss: 0.0243 403/500 [=======================>......] - ETA: 48s - loss: 0.3651 - regression_loss: 0.3409 - classification_loss: 0.0242 404/500 [=======================>......] - ETA: 48s - loss: 0.3653 - regression_loss: 0.3411 - classification_loss: 0.0242 405/500 [=======================>......] - ETA: 47s - loss: 0.3651 - regression_loss: 0.3409 - classification_loss: 0.0242 406/500 [=======================>......] - ETA: 47s - loss: 0.3649 - regression_loss: 0.3407 - classification_loss: 0.0242 407/500 [=======================>......] - ETA: 46s - loss: 0.3651 - regression_loss: 0.3409 - classification_loss: 0.0242 408/500 [=======================>......] - ETA: 46s - loss: 0.3654 - regression_loss: 0.3412 - classification_loss: 0.0242 409/500 [=======================>......] - ETA: 45s - loss: 0.3654 - regression_loss: 0.3413 - classification_loss: 0.0242 410/500 [=======================>......] - ETA: 45s - loss: 0.3655 - regression_loss: 0.3413 - classification_loss: 0.0241 411/500 [=======================>......] - ETA: 44s - loss: 0.3654 - regression_loss: 0.3412 - classification_loss: 0.0241 412/500 [=======================>......] - ETA: 44s - loss: 0.3653 - regression_loss: 0.3411 - classification_loss: 0.0241 413/500 [=======================>......] - ETA: 43s - loss: 0.3654 - regression_loss: 0.3412 - classification_loss: 0.0242 414/500 [=======================>......] 
- ETA: 43s - loss: 0.3657 - regression_loss: 0.3415 - classification_loss: 0.0241 415/500 [=======================>......] - ETA: 42s - loss: 0.3650 - regression_loss: 0.3409 - classification_loss: 0.0241 416/500 [=======================>......] - ETA: 42s - loss: 0.3649 - regression_loss: 0.3408 - classification_loss: 0.0242 417/500 [========================>.....] - ETA: 41s - loss: 0.3657 - regression_loss: 0.3415 - classification_loss: 0.0242 418/500 [========================>.....] - ETA: 41s - loss: 0.3658 - regression_loss: 0.3417 - classification_loss: 0.0242 419/500 [========================>.....] - ETA: 40s - loss: 0.3659 - regression_loss: 0.3417 - classification_loss: 0.0242 420/500 [========================>.....] - ETA: 40s - loss: 0.3664 - regression_loss: 0.3422 - classification_loss: 0.0242 421/500 [========================>.....] - ETA: 39s - loss: 0.3664 - regression_loss: 0.3422 - classification_loss: 0.0242 422/500 [========================>.....] - ETA: 39s - loss: 0.3668 - regression_loss: 0.3425 - classification_loss: 0.0242 423/500 [========================>.....] - ETA: 38s - loss: 0.3668 - regression_loss: 0.3426 - classification_loss: 0.0242 424/500 [========================>.....] - ETA: 38s - loss: 0.3667 - regression_loss: 0.3425 - classification_loss: 0.0242 425/500 [========================>.....] - ETA: 37s - loss: 0.3661 - regression_loss: 0.3420 - classification_loss: 0.0242 426/500 [========================>.....] - ETA: 37s - loss: 0.3665 - regression_loss: 0.3423 - classification_loss: 0.0242 427/500 [========================>.....] - ETA: 36s - loss: 0.3674 - regression_loss: 0.3431 - classification_loss: 0.0243 428/500 [========================>.....] - ETA: 36s - loss: 0.3679 - regression_loss: 0.3436 - classification_loss: 0.0243 429/500 [========================>.....] - ETA: 35s - loss: 0.3685 - regression_loss: 0.3441 - classification_loss: 0.0244 430/500 [========================>.....] 
- ETA: 35s - loss: 0.3684 - regression_loss: 0.3440 - classification_loss: 0.0243 431/500 [========================>.....] - ETA: 34s - loss: 0.3686 - regression_loss: 0.3442 - classification_loss: 0.0243 432/500 [========================>.....] - ETA: 34s - loss: 0.3686 - regression_loss: 0.3443 - classification_loss: 0.0244 433/500 [========================>.....] - ETA: 33s - loss: 0.3682 - regression_loss: 0.3439 - classification_loss: 0.0243 434/500 [=========================>....] - ETA: 33s - loss: 0.3682 - regression_loss: 0.3439 - classification_loss: 0.0243 435/500 [=========================>....] - ETA: 32s - loss: 0.3680 - regression_loss: 0.3437 - classification_loss: 0.0243 436/500 [=========================>....] - ETA: 32s - loss: 0.3678 - regression_loss: 0.3435 - classification_loss: 0.0243 437/500 [=========================>....] - ETA: 31s - loss: 0.3679 - regression_loss: 0.3436 - classification_loss: 0.0243 438/500 [=========================>....] - ETA: 31s - loss: 0.3677 - regression_loss: 0.3434 - classification_loss: 0.0243 439/500 [=========================>....] - ETA: 30s - loss: 0.3675 - regression_loss: 0.3432 - classification_loss: 0.0243 440/500 [=========================>....] - ETA: 30s - loss: 0.3671 - regression_loss: 0.3429 - classification_loss: 0.0242 441/500 [=========================>....] - ETA: 29s - loss: 0.3670 - regression_loss: 0.3427 - classification_loss: 0.0242 442/500 [=========================>....] - ETA: 29s - loss: 0.3668 - regression_loss: 0.3426 - classification_loss: 0.0242 443/500 [=========================>....] - ETA: 28s - loss: 0.3663 - regression_loss: 0.3421 - classification_loss: 0.0241 444/500 [=========================>....] - ETA: 28s - loss: 0.3667 - regression_loss: 0.3425 - classification_loss: 0.0242 445/500 [=========================>....] - ETA: 27s - loss: 0.3667 - regression_loss: 0.3425 - classification_loss: 0.0242 446/500 [=========================>....] 
- ETA: 27s - loss: 0.3669 - regression_loss: 0.3427 - classification_loss: 0.0242 447/500 [=========================>....] - ETA: 26s - loss: 0.3668 - regression_loss: 0.3426 - classification_loss: 0.0242 448/500 [=========================>....] - ETA: 26s - loss: 0.3663 - regression_loss: 0.3422 - classification_loss: 0.0241 449/500 [=========================>....] - ETA: 25s - loss: 0.3658 - regression_loss: 0.3417 - classification_loss: 0.0241 450/500 [==========================>...] - ETA: 25s - loss: 0.3659 - regression_loss: 0.3418 - classification_loss: 0.0242 451/500 [==========================>...] - ETA: 24s - loss: 0.3659 - regression_loss: 0.3417 - classification_loss: 0.0242 452/500 [==========================>...] - ETA: 24s - loss: 0.3656 - regression_loss: 0.3415 - classification_loss: 0.0241 453/500 [==========================>...] - ETA: 23s - loss: 0.3664 - regression_loss: 0.3421 - classification_loss: 0.0242 454/500 [==========================>...] - ETA: 23s - loss: 0.3662 - regression_loss: 0.3419 - classification_loss: 0.0242 455/500 [==========================>...] - ETA: 22s - loss: 0.3660 - regression_loss: 0.3418 - classification_loss: 0.0242 456/500 [==========================>...] - ETA: 22s - loss: 0.3662 - regression_loss: 0.3420 - classification_loss: 0.0242 457/500 [==========================>...] - ETA: 21s - loss: 0.3663 - regression_loss: 0.3420 - classification_loss: 0.0242 458/500 [==========================>...] - ETA: 21s - loss: 0.3658 - regression_loss: 0.3417 - classification_loss: 0.0242 459/500 [==========================>...] - ETA: 20s - loss: 0.3661 - regression_loss: 0.3419 - classification_loss: 0.0242 460/500 [==========================>...] - ETA: 20s - loss: 0.3666 - regression_loss: 0.3423 - classification_loss: 0.0243 461/500 [==========================>...] - ETA: 19s - loss: 0.3663 - regression_loss: 0.3421 - classification_loss: 0.0242 462/500 [==========================>...] 
500/500 [==============================] - 251s 502ms/step - loss: 0.3657 - regression_loss: 0.3415 - classification_loss: 0.0242
1172 instances of class plum with average precision: 0.7312
mAP: 0.7312
Epoch 00013: saving model to ./training/snapshots/resnet101_pascal_13.h5
Epoch 14/150
297/500 [================>.............]
- ETA: 1:41 - loss: 0.3469 - regression_loss: 0.3243 - classification_loss: 0.0226 298/500 [================>.............] - ETA: 1:41 - loss: 0.3480 - regression_loss: 0.3254 - classification_loss: 0.0227 299/500 [================>.............] - ETA: 1:40 - loss: 0.3492 - regression_loss: 0.3264 - classification_loss: 0.0227 300/500 [=================>............] - ETA: 1:40 - loss: 0.3505 - regression_loss: 0.3277 - classification_loss: 0.0227 301/500 [=================>............] - ETA: 1:39 - loss: 0.3508 - regression_loss: 0.3280 - classification_loss: 0.0227 302/500 [=================>............] - ETA: 1:39 - loss: 0.3505 - regression_loss: 0.3278 - classification_loss: 0.0227 303/500 [=================>............] - ETA: 1:38 - loss: 0.3507 - regression_loss: 0.3280 - classification_loss: 0.0227 304/500 [=================>............] - ETA: 1:38 - loss: 0.3515 - regression_loss: 0.3287 - classification_loss: 0.0228 305/500 [=================>............] - ETA: 1:37 - loss: 0.3511 - regression_loss: 0.3284 - classification_loss: 0.0227 306/500 [=================>............] - ETA: 1:37 - loss: 0.3505 - regression_loss: 0.3278 - classification_loss: 0.0227 307/500 [=================>............] - ETA: 1:36 - loss: 0.3507 - regression_loss: 0.3280 - classification_loss: 0.0227 308/500 [=================>............] - ETA: 1:36 - loss: 0.3508 - regression_loss: 0.3281 - classification_loss: 0.0227 309/500 [=================>............] - ETA: 1:35 - loss: 0.3502 - regression_loss: 0.3276 - classification_loss: 0.0226 310/500 [=================>............] - ETA: 1:35 - loss: 0.3499 - regression_loss: 0.3273 - classification_loss: 0.0226 311/500 [=================>............] - ETA: 1:34 - loss: 0.3504 - regression_loss: 0.3278 - classification_loss: 0.0226 312/500 [=================>............] - ETA: 1:34 - loss: 0.3501 - regression_loss: 0.3276 - classification_loss: 0.0226 313/500 [=================>............] 
- ETA: 1:33 - loss: 0.3495 - regression_loss: 0.3269 - classification_loss: 0.0225 314/500 [=================>............] - ETA: 1:33 - loss: 0.3490 - regression_loss: 0.3265 - classification_loss: 0.0225 315/500 [=================>............] - ETA: 1:32 - loss: 0.3504 - regression_loss: 0.3276 - classification_loss: 0.0228 316/500 [=================>............] - ETA: 1:32 - loss: 0.3511 - regression_loss: 0.3282 - classification_loss: 0.0230 317/500 [==================>...........] - ETA: 1:31 - loss: 0.3521 - regression_loss: 0.3292 - classification_loss: 0.0229 318/500 [==================>...........] - ETA: 1:31 - loss: 0.3516 - regression_loss: 0.3287 - classification_loss: 0.0229 319/500 [==================>...........] - ETA: 1:30 - loss: 0.3517 - regression_loss: 0.3289 - classification_loss: 0.0229 320/500 [==================>...........] - ETA: 1:30 - loss: 0.3520 - regression_loss: 0.3290 - classification_loss: 0.0229 321/500 [==================>...........] - ETA: 1:29 - loss: 0.3526 - regression_loss: 0.3296 - classification_loss: 0.0230 322/500 [==================>...........] - ETA: 1:29 - loss: 0.3526 - regression_loss: 0.3296 - classification_loss: 0.0230 323/500 [==================>...........] - ETA: 1:28 - loss: 0.3525 - regression_loss: 0.3295 - classification_loss: 0.0230 324/500 [==================>...........] - ETA: 1:28 - loss: 0.3523 - regression_loss: 0.3293 - classification_loss: 0.0230 325/500 [==================>...........] - ETA: 1:27 - loss: 0.3527 - regression_loss: 0.3297 - classification_loss: 0.0230 326/500 [==================>...........] - ETA: 1:27 - loss: 0.3526 - regression_loss: 0.3296 - classification_loss: 0.0230 327/500 [==================>...........] - ETA: 1:26 - loss: 0.3528 - regression_loss: 0.3297 - classification_loss: 0.0231 328/500 [==================>...........] - ETA: 1:26 - loss: 0.3521 - regression_loss: 0.3290 - classification_loss: 0.0230 329/500 [==================>...........] 
- ETA: 1:25 - loss: 0.3523 - regression_loss: 0.3293 - classification_loss: 0.0231 330/500 [==================>...........] - ETA: 1:25 - loss: 0.3522 - regression_loss: 0.3291 - classification_loss: 0.0231 331/500 [==================>...........] - ETA: 1:24 - loss: 0.3519 - regression_loss: 0.3289 - classification_loss: 0.0230 332/500 [==================>...........] - ETA: 1:24 - loss: 0.3515 - regression_loss: 0.3286 - classification_loss: 0.0230 333/500 [==================>...........] - ETA: 1:23 - loss: 0.3527 - regression_loss: 0.3296 - classification_loss: 0.0230 334/500 [===================>..........] - ETA: 1:23 - loss: 0.3526 - regression_loss: 0.3296 - classification_loss: 0.0230 335/500 [===================>..........] - ETA: 1:22 - loss: 0.3530 - regression_loss: 0.3300 - classification_loss: 0.0231 336/500 [===================>..........] - ETA: 1:22 - loss: 0.3527 - regression_loss: 0.3296 - classification_loss: 0.0230 337/500 [===================>..........] - ETA: 1:21 - loss: 0.3524 - regression_loss: 0.3294 - classification_loss: 0.0230 338/500 [===================>..........] - ETA: 1:21 - loss: 0.3524 - regression_loss: 0.3294 - classification_loss: 0.0230 339/500 [===================>..........] - ETA: 1:20 - loss: 0.3517 - regression_loss: 0.3288 - classification_loss: 0.0229 340/500 [===================>..........] - ETA: 1:20 - loss: 0.3518 - regression_loss: 0.3289 - classification_loss: 0.0229 341/500 [===================>..........] - ETA: 1:19 - loss: 0.3524 - regression_loss: 0.3295 - classification_loss: 0.0229 342/500 [===================>..........] - ETA: 1:19 - loss: 0.3528 - regression_loss: 0.3299 - classification_loss: 0.0229 343/500 [===================>..........] - ETA: 1:18 - loss: 0.3524 - regression_loss: 0.3295 - classification_loss: 0.0229 344/500 [===================>..........] - ETA: 1:18 - loss: 0.3518 - regression_loss: 0.3290 - classification_loss: 0.0228 345/500 [===================>..........] 
- ETA: 1:17 - loss: 0.3519 - regression_loss: 0.3291 - classification_loss: 0.0228 346/500 [===================>..........] - ETA: 1:17 - loss: 0.3516 - regression_loss: 0.3288 - classification_loss: 0.0228 347/500 [===================>..........] - ETA: 1:16 - loss: 0.3517 - regression_loss: 0.3289 - classification_loss: 0.0228 348/500 [===================>..........] - ETA: 1:16 - loss: 0.3515 - regression_loss: 0.3287 - classification_loss: 0.0227 349/500 [===================>..........] - ETA: 1:15 - loss: 0.3520 - regression_loss: 0.3293 - classification_loss: 0.0228 350/500 [====================>.........] - ETA: 1:15 - loss: 0.3527 - regression_loss: 0.3299 - classification_loss: 0.0228 351/500 [====================>.........] - ETA: 1:14 - loss: 0.3522 - regression_loss: 0.3294 - classification_loss: 0.0228 352/500 [====================>.........] - ETA: 1:14 - loss: 0.3521 - regression_loss: 0.3294 - classification_loss: 0.0227 353/500 [====================>.........] - ETA: 1:13 - loss: 0.3517 - regression_loss: 0.3290 - classification_loss: 0.0227 354/500 [====================>.........] - ETA: 1:13 - loss: 0.3517 - regression_loss: 0.3290 - classification_loss: 0.0227 355/500 [====================>.........] - ETA: 1:12 - loss: 0.3525 - regression_loss: 0.3297 - classification_loss: 0.0228 356/500 [====================>.........] - ETA: 1:12 - loss: 0.3520 - regression_loss: 0.3292 - classification_loss: 0.0227 357/500 [====================>.........] - ETA: 1:11 - loss: 0.3514 - regression_loss: 0.3287 - classification_loss: 0.0227 358/500 [====================>.........] - ETA: 1:11 - loss: 0.3515 - regression_loss: 0.3287 - classification_loss: 0.0227 359/500 [====================>.........] - ETA: 1:10 - loss: 0.3515 - regression_loss: 0.3288 - classification_loss: 0.0227 360/500 [====================>.........] - ETA: 1:10 - loss: 0.3517 - regression_loss: 0.3290 - classification_loss: 0.0227 361/500 [====================>.........] 
- ETA: 1:09 - loss: 0.3520 - regression_loss: 0.3293 - classification_loss: 0.0227 362/500 [====================>.........] - ETA: 1:09 - loss: 0.3527 - regression_loss: 0.3299 - classification_loss: 0.0228 363/500 [====================>.........] - ETA: 1:08 - loss: 0.3530 - regression_loss: 0.3302 - classification_loss: 0.0228 364/500 [====================>.........] - ETA: 1:08 - loss: 0.3529 - regression_loss: 0.3300 - classification_loss: 0.0228 365/500 [====================>.........] - ETA: 1:07 - loss: 0.3528 - regression_loss: 0.3299 - classification_loss: 0.0228 366/500 [====================>.........] - ETA: 1:07 - loss: 0.3529 - regression_loss: 0.3300 - classification_loss: 0.0229 367/500 [=====================>........] - ETA: 1:06 - loss: 0.3528 - regression_loss: 0.3299 - classification_loss: 0.0229 368/500 [=====================>........] - ETA: 1:06 - loss: 0.3530 - regression_loss: 0.3300 - classification_loss: 0.0229 369/500 [=====================>........] - ETA: 1:05 - loss: 0.3532 - regression_loss: 0.3303 - classification_loss: 0.0229 370/500 [=====================>........] - ETA: 1:05 - loss: 0.3533 - regression_loss: 0.3304 - classification_loss: 0.0229 371/500 [=====================>........] - ETA: 1:04 - loss: 0.3533 - regression_loss: 0.3304 - classification_loss: 0.0229 372/500 [=====================>........] - ETA: 1:04 - loss: 0.3537 - regression_loss: 0.3308 - classification_loss: 0.0229 373/500 [=====================>........] - ETA: 1:03 - loss: 0.3532 - regression_loss: 0.3304 - classification_loss: 0.0229 374/500 [=====================>........] - ETA: 1:03 - loss: 0.3537 - regression_loss: 0.3308 - classification_loss: 0.0229 375/500 [=====================>........] - ETA: 1:02 - loss: 0.3533 - regression_loss: 0.3305 - classification_loss: 0.0228 376/500 [=====================>........] - ETA: 1:02 - loss: 0.3537 - regression_loss: 0.3308 - classification_loss: 0.0229 377/500 [=====================>........] 
- ETA: 1:01 - loss: 0.3536 - regression_loss: 0.3308 - classification_loss: 0.0228 378/500 [=====================>........] - ETA: 1:01 - loss: 0.3534 - regression_loss: 0.3305 - classification_loss: 0.0228 379/500 [=====================>........] - ETA: 1:00 - loss: 0.3530 - regression_loss: 0.3302 - classification_loss: 0.0228 380/500 [=====================>........] - ETA: 1:00 - loss: 0.3534 - regression_loss: 0.3306 - classification_loss: 0.0228 381/500 [=====================>........] - ETA: 59s - loss: 0.3528 - regression_loss: 0.3300 - classification_loss: 0.0228  382/500 [=====================>........] - ETA: 59s - loss: 0.3525 - regression_loss: 0.3298 - classification_loss: 0.0228 383/500 [=====================>........] - ETA: 58s - loss: 0.3523 - regression_loss: 0.3295 - classification_loss: 0.0228 384/500 [======================>.......] - ETA: 58s - loss: 0.3525 - regression_loss: 0.3297 - classification_loss: 0.0228 385/500 [======================>.......] - ETA: 57s - loss: 0.3523 - regression_loss: 0.3295 - classification_loss: 0.0228 386/500 [======================>.......] - ETA: 57s - loss: 0.3522 - regression_loss: 0.3295 - classification_loss: 0.0228 387/500 [======================>.......] - ETA: 56s - loss: 0.3525 - regression_loss: 0.3297 - classification_loss: 0.0228 388/500 [======================>.......] - ETA: 56s - loss: 0.3525 - regression_loss: 0.3297 - classification_loss: 0.0227 389/500 [======================>.......] - ETA: 55s - loss: 0.3521 - regression_loss: 0.3294 - classification_loss: 0.0227 390/500 [======================>.......] - ETA: 55s - loss: 0.3520 - regression_loss: 0.3293 - classification_loss: 0.0227 391/500 [======================>.......] - ETA: 54s - loss: 0.3517 - regression_loss: 0.3290 - classification_loss: 0.0227 392/500 [======================>.......] - ETA: 54s - loss: 0.3520 - regression_loss: 0.3293 - classification_loss: 0.0227 393/500 [======================>.......] 
- ETA: 53s - loss: 0.3515 - regression_loss: 0.3289 - classification_loss: 0.0226 394/500 [======================>.......] - ETA: 53s - loss: 0.3519 - regression_loss: 0.3292 - classification_loss: 0.0227 395/500 [======================>.......] - ETA: 52s - loss: 0.3527 - regression_loss: 0.3300 - classification_loss: 0.0227 396/500 [======================>.......] - ETA: 52s - loss: 0.3525 - regression_loss: 0.3298 - classification_loss: 0.0227 397/500 [======================>.......] - ETA: 51s - loss: 0.3528 - regression_loss: 0.3300 - classification_loss: 0.0227 398/500 [======================>.......] - ETA: 51s - loss: 0.3522 - regression_loss: 0.3295 - classification_loss: 0.0227 399/500 [======================>.......] - ETA: 50s - loss: 0.3518 - regression_loss: 0.3292 - classification_loss: 0.0227 400/500 [=======================>......] - ETA: 50s - loss: 0.3517 - regression_loss: 0.3291 - classification_loss: 0.0227 401/500 [=======================>......] - ETA: 49s - loss: 0.3516 - regression_loss: 0.3290 - classification_loss: 0.0226 402/500 [=======================>......] - ETA: 49s - loss: 0.3513 - regression_loss: 0.3287 - classification_loss: 0.0226 403/500 [=======================>......] - ETA: 48s - loss: 0.3514 - regression_loss: 0.3288 - classification_loss: 0.0226 404/500 [=======================>......] - ETA: 48s - loss: 0.3516 - regression_loss: 0.3290 - classification_loss: 0.0226 405/500 [=======================>......] - ETA: 47s - loss: 0.3518 - regression_loss: 0.3292 - classification_loss: 0.0226 406/500 [=======================>......] - ETA: 47s - loss: 0.3519 - regression_loss: 0.3293 - classification_loss: 0.0226 407/500 [=======================>......] - ETA: 46s - loss: 0.3518 - regression_loss: 0.3291 - classification_loss: 0.0226 408/500 [=======================>......] - ETA: 46s - loss: 0.3519 - regression_loss: 0.3292 - classification_loss: 0.0227 409/500 [=======================>......] 
- ETA: 45s - loss: 0.3518 - regression_loss: 0.3290 - classification_loss: 0.0227 410/500 [=======================>......] - ETA: 45s - loss: 0.3526 - regression_loss: 0.3298 - classification_loss: 0.0228 411/500 [=======================>......] - ETA: 44s - loss: 0.3525 - regression_loss: 0.3298 - classification_loss: 0.0228 412/500 [=======================>......] - ETA: 44s - loss: 0.3526 - regression_loss: 0.3298 - classification_loss: 0.0228 413/500 [=======================>......] - ETA: 43s - loss: 0.3528 - regression_loss: 0.3301 - classification_loss: 0.0228 414/500 [=======================>......] - ETA: 43s - loss: 0.3524 - regression_loss: 0.3296 - classification_loss: 0.0227 415/500 [=======================>......] - ETA: 42s - loss: 0.3521 - regression_loss: 0.3294 - classification_loss: 0.0227 416/500 [=======================>......] - ETA: 42s - loss: 0.3522 - regression_loss: 0.3296 - classification_loss: 0.0227 417/500 [========================>.....] - ETA: 41s - loss: 0.3531 - regression_loss: 0.3303 - classification_loss: 0.0227 418/500 [========================>.....] - ETA: 41s - loss: 0.3532 - regression_loss: 0.3304 - classification_loss: 0.0228 419/500 [========================>.....] - ETA: 40s - loss: 0.3537 - regression_loss: 0.3308 - classification_loss: 0.0229 420/500 [========================>.....] - ETA: 40s - loss: 0.3534 - regression_loss: 0.3305 - classification_loss: 0.0229 421/500 [========================>.....] - ETA: 39s - loss: 0.3532 - regression_loss: 0.3304 - classification_loss: 0.0229 422/500 [========================>.....] - ETA: 39s - loss: 0.3531 - regression_loss: 0.3303 - classification_loss: 0.0229 423/500 [========================>.....] - ETA: 38s - loss: 0.3529 - regression_loss: 0.3300 - classification_loss: 0.0229 424/500 [========================>.....] - ETA: 38s - loss: 0.3525 - regression_loss: 0.3297 - classification_loss: 0.0228 425/500 [========================>.....] 
- ETA: 37s - loss: 0.3529 - regression_loss: 0.3301 - classification_loss: 0.0229 426/500 [========================>.....] - ETA: 37s - loss: 0.3533 - regression_loss: 0.3305 - classification_loss: 0.0228 427/500 [========================>.....] - ETA: 36s - loss: 0.3531 - regression_loss: 0.3302 - classification_loss: 0.0228 428/500 [========================>.....] - ETA: 36s - loss: 0.3528 - regression_loss: 0.3300 - classification_loss: 0.0228 429/500 [========================>.....] - ETA: 35s - loss: 0.3541 - regression_loss: 0.3313 - classification_loss: 0.0228 430/500 [========================>.....] - ETA: 35s - loss: 0.3542 - regression_loss: 0.3314 - classification_loss: 0.0228 431/500 [========================>.....] - ETA: 34s - loss: 0.3543 - regression_loss: 0.3315 - classification_loss: 0.0228 432/500 [========================>.....] - ETA: 34s - loss: 0.3542 - regression_loss: 0.3314 - classification_loss: 0.0228 433/500 [========================>.....] - ETA: 33s - loss: 0.3549 - regression_loss: 0.3321 - classification_loss: 0.0229 434/500 [=========================>....] - ETA: 33s - loss: 0.3552 - regression_loss: 0.3323 - classification_loss: 0.0229 435/500 [=========================>....] - ETA: 32s - loss: 0.3551 - regression_loss: 0.3323 - classification_loss: 0.0229 436/500 [=========================>....] - ETA: 32s - loss: 0.3549 - regression_loss: 0.3321 - classification_loss: 0.0229 437/500 [=========================>....] - ETA: 31s - loss: 0.3544 - regression_loss: 0.3316 - classification_loss: 0.0228 438/500 [=========================>....] - ETA: 31s - loss: 0.3548 - regression_loss: 0.3319 - classification_loss: 0.0229 439/500 [=========================>....] - ETA: 30s - loss: 0.3547 - regression_loss: 0.3318 - classification_loss: 0.0229 440/500 [=========================>....] - ETA: 30s - loss: 0.3548 - regression_loss: 0.3319 - classification_loss: 0.0229 441/500 [=========================>....] 
- ETA: 29s - loss: 0.3545 - regression_loss: 0.3317 - classification_loss: 0.0229 442/500 [=========================>....] - ETA: 29s - loss: 0.3555 - regression_loss: 0.3324 - classification_loss: 0.0230 443/500 [=========================>....] - ETA: 28s - loss: 0.3557 - regression_loss: 0.3326 - classification_loss: 0.0230 444/500 [=========================>....] - ETA: 28s - loss: 0.3563 - regression_loss: 0.3332 - classification_loss: 0.0231 445/500 [=========================>....] - ETA: 27s - loss: 0.3562 - regression_loss: 0.3331 - classification_loss: 0.0231 446/500 [=========================>....] - ETA: 27s - loss: 0.3564 - regression_loss: 0.3332 - classification_loss: 0.0232 447/500 [=========================>....] - ETA: 26s - loss: 0.3564 - regression_loss: 0.3332 - classification_loss: 0.0231 448/500 [=========================>....] - ETA: 26s - loss: 0.3570 - regression_loss: 0.3339 - classification_loss: 0.0231 449/500 [=========================>....] - ETA: 25s - loss: 0.3576 - regression_loss: 0.3345 - classification_loss: 0.0231 450/500 [==========================>...] - ETA: 25s - loss: 0.3579 - regression_loss: 0.3348 - classification_loss: 0.0231 451/500 [==========================>...] - ETA: 24s - loss: 0.3585 - regression_loss: 0.3354 - classification_loss: 0.0231 452/500 [==========================>...] - ETA: 24s - loss: 0.3583 - regression_loss: 0.3353 - classification_loss: 0.0231 453/500 [==========================>...] - ETA: 23s - loss: 0.3585 - regression_loss: 0.3354 - classification_loss: 0.0231 454/500 [==========================>...] - ETA: 23s - loss: 0.3579 - regression_loss: 0.3349 - classification_loss: 0.0230 455/500 [==========================>...] - ETA: 22s - loss: 0.3576 - regression_loss: 0.3346 - classification_loss: 0.0230 456/500 [==========================>...] - ETA: 22s - loss: 0.3574 - regression_loss: 0.3344 - classification_loss: 0.0230 457/500 [==========================>...] 
- ETA: 21s - loss: 0.3577 - regression_loss: 0.3347 - classification_loss: 0.0230 458/500 [==========================>...] - ETA: 21s - loss: 0.3585 - regression_loss: 0.3355 - classification_loss: 0.0230 459/500 [==========================>...] - ETA: 20s - loss: 0.3592 - regression_loss: 0.3361 - classification_loss: 0.0231 460/500 [==========================>...] - ETA: 20s - loss: 0.3594 - regression_loss: 0.3364 - classification_loss: 0.0231 461/500 [==========================>...] - ETA: 19s - loss: 0.3596 - regression_loss: 0.3366 - classification_loss: 0.0231 462/500 [==========================>...] - ETA: 19s - loss: 0.3593 - regression_loss: 0.3363 - classification_loss: 0.0230 463/500 [==========================>...] - ETA: 18s - loss: 0.3589 - regression_loss: 0.3359 - classification_loss: 0.0230 464/500 [==========================>...] - ETA: 18s - loss: 0.3588 - regression_loss: 0.3358 - classification_loss: 0.0230 465/500 [==========================>...] - ETA: 17s - loss: 0.3585 - regression_loss: 0.3355 - classification_loss: 0.0230 466/500 [==========================>...] - ETA: 17s - loss: 0.3591 - regression_loss: 0.3361 - classification_loss: 0.0230 467/500 [===========================>..] - ETA: 16s - loss: 0.3593 - regression_loss: 0.3362 - classification_loss: 0.0231 468/500 [===========================>..] - ETA: 16s - loss: 0.3598 - regression_loss: 0.3368 - classification_loss: 0.0230 469/500 [===========================>..] - ETA: 15s - loss: 0.3602 - regression_loss: 0.3371 - classification_loss: 0.0230 470/500 [===========================>..] - ETA: 15s - loss: 0.3597 - regression_loss: 0.3367 - classification_loss: 0.0230 471/500 [===========================>..] - ETA: 14s - loss: 0.3593 - regression_loss: 0.3363 - classification_loss: 0.0230 472/500 [===========================>..] - ETA: 14s - loss: 0.3592 - regression_loss: 0.3362 - classification_loss: 0.0230 473/500 [===========================>..] 
- ETA: 13s - loss: 0.3595 - regression_loss: 0.3365 - classification_loss: 0.0230 474/500 [===========================>..] - ETA: 13s - loss: 0.3591 - regression_loss: 0.3361 - classification_loss: 0.0230 475/500 [===========================>..] - ETA: 12s - loss: 0.3589 - regression_loss: 0.3359 - classification_loss: 0.0230 476/500 [===========================>..] - ETA: 12s - loss: 0.3592 - regression_loss: 0.3362 - classification_loss: 0.0230 477/500 [===========================>..] - ETA: 11s - loss: 0.3587 - regression_loss: 0.3357 - classification_loss: 0.0229 478/500 [===========================>..] - ETA: 11s - loss: 0.3584 - regression_loss: 0.3355 - classification_loss: 0.0229 479/500 [===========================>..] - ETA: 10s - loss: 0.3584 - regression_loss: 0.3355 - classification_loss: 0.0229 480/500 [===========================>..] - ETA: 10s - loss: 0.3584 - regression_loss: 0.3355 - classification_loss: 0.0229 481/500 [===========================>..] - ETA: 9s - loss: 0.3580 - regression_loss: 0.3351 - classification_loss: 0.0229  482/500 [===========================>..] - ETA: 9s - loss: 0.3579 - regression_loss: 0.3350 - classification_loss: 0.0229 483/500 [===========================>..] - ETA: 8s - loss: 0.3580 - regression_loss: 0.3352 - classification_loss: 0.0228 484/500 [============================>.] - ETA: 8s - loss: 0.3576 - regression_loss: 0.3348 - classification_loss: 0.0228 485/500 [============================>.] - ETA: 7s - loss: 0.3574 - regression_loss: 0.3346 - classification_loss: 0.0228 486/500 [============================>.] - ETA: 7s - loss: 0.3574 - regression_loss: 0.3346 - classification_loss: 0.0228 487/500 [============================>.] - ETA: 6s - loss: 0.3572 - regression_loss: 0.3344 - classification_loss: 0.0228 488/500 [============================>.] - ETA: 6s - loss: 0.3569 - regression_loss: 0.3341 - classification_loss: 0.0228 489/500 [============================>.] 
- ETA: 5s - loss: 0.3571 - regression_loss: 0.3343 - classification_loss: 0.0228 490/500 [============================>.] - ETA: 5s - loss: 0.3568 - regression_loss: 0.3340 - classification_loss: 0.0228 491/500 [============================>.] - ETA: 4s - loss: 0.3564 - regression_loss: 0.3337 - classification_loss: 0.0228 492/500 [============================>.] - ETA: 4s - loss: 0.3561 - regression_loss: 0.3333 - classification_loss: 0.0227 493/500 [============================>.] - ETA: 3s - loss: 0.3563 - regression_loss: 0.3335 - classification_loss: 0.0227 494/500 [============================>.] - ETA: 3s - loss: 0.3560 - regression_loss: 0.3333 - classification_loss: 0.0227 495/500 [============================>.] - ETA: 2s - loss: 0.3559 - regression_loss: 0.3332 - classification_loss: 0.0227 496/500 [============================>.] - ETA: 2s - loss: 0.3558 - regression_loss: 0.3330 - classification_loss: 0.0227 497/500 [============================>.] - ETA: 1s - loss: 0.3555 - regression_loss: 0.3328 - classification_loss: 0.0227 498/500 [============================>.] - ETA: 1s - loss: 0.3552 - regression_loss: 0.3326 - classification_loss: 0.0227 499/500 [============================>.] - ETA: 0s - loss: 0.3550 - regression_loss: 0.3323 - classification_loss: 0.0226 500/500 [==============================] - 251s 502ms/step - loss: 0.3550 - regression_loss: 0.3323 - classification_loss: 0.0227 1172 instances of class plum with average precision: 0.7330 mAP: 0.7330 Epoch 00014: saving model to ./training/snapshots/resnet101_pascal_14.h5 Epoch 15/150 1/500 [..............................] - ETA: 4:26 - loss: 0.7081 - regression_loss: 0.6752 - classification_loss: 0.0329 2/500 [..............................] - ETA: 4:14 - loss: 0.6715 - regression_loss: 0.6155 - classification_loss: 0.0560 3/500 [..............................] - ETA: 4:11 - loss: 0.5158 - regression_loss: 0.4761 - classification_loss: 0.0397 4/500 [..............................] 
- ETA: 4:09 - loss: 0.5252 - regression_loss: 0.4868 - classification_loss: 0.0384
[batches 5-339 of 500: repeated carriage-return progress-bar updates condensed; loss 0.4806 -> 0.3286, regression_loss 0.4471 -> 0.3080, classification_loss 0.0335 -> 0.0206, ETA 4:08 -> 1:20]
340/500 [===================>..........] 
- ETA: 1:20 - loss: 0.3287 - regression_loss: 0.3081 - classification_loss: 0.0206 341/500 [===================>..........] - ETA: 1:19 - loss: 0.3286 - regression_loss: 0.3080 - classification_loss: 0.0206 342/500 [===================>..........] - ETA: 1:19 - loss: 0.3283 - regression_loss: 0.3077 - classification_loss: 0.0206 343/500 [===================>..........] - ETA: 1:18 - loss: 0.3285 - regression_loss: 0.3079 - classification_loss: 0.0206 344/500 [===================>..........] - ETA: 1:18 - loss: 0.3283 - regression_loss: 0.3078 - classification_loss: 0.0205 345/500 [===================>..........] - ETA: 1:17 - loss: 0.3291 - regression_loss: 0.3086 - classification_loss: 0.0205 346/500 [===================>..........] - ETA: 1:17 - loss: 0.3291 - regression_loss: 0.3086 - classification_loss: 0.0205 347/500 [===================>..........] - ETA: 1:16 - loss: 0.3287 - regression_loss: 0.3082 - classification_loss: 0.0205 348/500 [===================>..........] - ETA: 1:16 - loss: 0.3287 - regression_loss: 0.3082 - classification_loss: 0.0205 349/500 [===================>..........] - ETA: 1:15 - loss: 0.3289 - regression_loss: 0.3084 - classification_loss: 0.0205 350/500 [====================>.........] - ETA: 1:15 - loss: 0.3289 - regression_loss: 0.3084 - classification_loss: 0.0205 351/500 [====================>.........] - ETA: 1:14 - loss: 0.3289 - regression_loss: 0.3084 - classification_loss: 0.0205 352/500 [====================>.........] - ETA: 1:14 - loss: 0.3296 - regression_loss: 0.3091 - classification_loss: 0.0205 353/500 [====================>.........] - ETA: 1:13 - loss: 0.3297 - regression_loss: 0.3092 - classification_loss: 0.0205 354/500 [====================>.........] - ETA: 1:13 - loss: 0.3303 - regression_loss: 0.3097 - classification_loss: 0.0206 355/500 [====================>.........] - ETA: 1:12 - loss: 0.3299 - regression_loss: 0.3093 - classification_loss: 0.0206 356/500 [====================>.........] 
- ETA: 1:12 - loss: 0.3300 - regression_loss: 0.3094 - classification_loss: 0.0206 357/500 [====================>.........] - ETA: 1:11 - loss: 0.3296 - regression_loss: 0.3091 - classification_loss: 0.0205 358/500 [====================>.........] - ETA: 1:11 - loss: 0.3295 - regression_loss: 0.3090 - classification_loss: 0.0205 359/500 [====================>.........] - ETA: 1:10 - loss: 0.3290 - regression_loss: 0.3085 - classification_loss: 0.0205 360/500 [====================>.........] - ETA: 1:10 - loss: 0.3292 - regression_loss: 0.3087 - classification_loss: 0.0205 361/500 [====================>.........] - ETA: 1:09 - loss: 0.3295 - regression_loss: 0.3089 - classification_loss: 0.0206 362/500 [====================>.........] - ETA: 1:09 - loss: 0.3291 - regression_loss: 0.3086 - classification_loss: 0.0205 363/500 [====================>.........] - ETA: 1:08 - loss: 0.3290 - regression_loss: 0.3085 - classification_loss: 0.0205 364/500 [====================>.........] - ETA: 1:08 - loss: 0.3291 - regression_loss: 0.3086 - classification_loss: 0.0206 365/500 [====================>.........] - ETA: 1:07 - loss: 0.3289 - regression_loss: 0.3083 - classification_loss: 0.0205 366/500 [====================>.........] - ETA: 1:07 - loss: 0.3291 - regression_loss: 0.3085 - classification_loss: 0.0205 367/500 [=====================>........] - ETA: 1:06 - loss: 0.3290 - regression_loss: 0.3085 - classification_loss: 0.0205 368/500 [=====================>........] - ETA: 1:06 - loss: 0.3288 - regression_loss: 0.3083 - classification_loss: 0.0205 369/500 [=====================>........] - ETA: 1:05 - loss: 0.3289 - regression_loss: 0.3084 - classification_loss: 0.0205 370/500 [=====================>........] - ETA: 1:05 - loss: 0.3286 - regression_loss: 0.3082 - classification_loss: 0.0205 371/500 [=====================>........] - ETA: 1:04 - loss: 0.3288 - regression_loss: 0.3083 - classification_loss: 0.0205 372/500 [=====================>........] 
- ETA: 1:04 - loss: 0.3290 - regression_loss: 0.3085 - classification_loss: 0.0205 373/500 [=====================>........] - ETA: 1:03 - loss: 0.3283 - regression_loss: 0.3079 - classification_loss: 0.0204 374/500 [=====================>........] - ETA: 1:03 - loss: 0.3282 - regression_loss: 0.3078 - classification_loss: 0.0204 375/500 [=====================>........] - ETA: 1:02 - loss: 0.3278 - regression_loss: 0.3074 - classification_loss: 0.0204 376/500 [=====================>........] - ETA: 1:02 - loss: 0.3279 - regression_loss: 0.3075 - classification_loss: 0.0204 377/500 [=====================>........] - ETA: 1:01 - loss: 0.3279 - regression_loss: 0.3075 - classification_loss: 0.0204 378/500 [=====================>........] - ETA: 1:01 - loss: 0.3280 - regression_loss: 0.3076 - classification_loss: 0.0204 379/500 [=====================>........] - ETA: 1:00 - loss: 0.3280 - regression_loss: 0.3076 - classification_loss: 0.0204 380/500 [=====================>........] - ETA: 1:00 - loss: 0.3275 - regression_loss: 0.3072 - classification_loss: 0.0204 381/500 [=====================>........] - ETA: 59s - loss: 0.3274 - regression_loss: 0.3070 - classification_loss: 0.0204  382/500 [=====================>........] - ETA: 59s - loss: 0.3277 - regression_loss: 0.3073 - classification_loss: 0.0204 383/500 [=====================>........] - ETA: 58s - loss: 0.3274 - regression_loss: 0.3071 - classification_loss: 0.0203 384/500 [======================>.......] - ETA: 58s - loss: 0.3277 - regression_loss: 0.3074 - classification_loss: 0.0204 385/500 [======================>.......] - ETA: 57s - loss: 0.3286 - regression_loss: 0.3082 - classification_loss: 0.0204 386/500 [======================>.......] - ETA: 57s - loss: 0.3290 - regression_loss: 0.3086 - classification_loss: 0.0204 387/500 [======================>.......] - ETA: 56s - loss: 0.3288 - regression_loss: 0.3084 - classification_loss: 0.0204 388/500 [======================>.......] 
- ETA: 56s - loss: 0.3284 - regression_loss: 0.3081 - classification_loss: 0.0203 389/500 [======================>.......] - ETA: 55s - loss: 0.3285 - regression_loss: 0.3082 - classification_loss: 0.0203 390/500 [======================>.......] - ETA: 55s - loss: 0.3284 - regression_loss: 0.3081 - classification_loss: 0.0203 391/500 [======================>.......] - ETA: 54s - loss: 0.3284 - regression_loss: 0.3081 - classification_loss: 0.0203 392/500 [======================>.......] - ETA: 54s - loss: 0.3297 - regression_loss: 0.3092 - classification_loss: 0.0205 393/500 [======================>.......] - ETA: 53s - loss: 0.3303 - regression_loss: 0.3097 - classification_loss: 0.0205 394/500 [======================>.......] - ETA: 53s - loss: 0.3310 - regression_loss: 0.3105 - classification_loss: 0.0205 395/500 [======================>.......] - ETA: 52s - loss: 0.3315 - regression_loss: 0.3110 - classification_loss: 0.0205 396/500 [======================>.......] - ETA: 52s - loss: 0.3327 - regression_loss: 0.3122 - classification_loss: 0.0205 397/500 [======================>.......] - ETA: 51s - loss: 0.3323 - regression_loss: 0.3119 - classification_loss: 0.0205 398/500 [======================>.......] - ETA: 51s - loss: 0.3321 - regression_loss: 0.3117 - classification_loss: 0.0205 399/500 [======================>.......] - ETA: 50s - loss: 0.3321 - regression_loss: 0.3116 - classification_loss: 0.0205 400/500 [=======================>......] - ETA: 50s - loss: 0.3321 - regression_loss: 0.3117 - classification_loss: 0.0205 401/500 [=======================>......] - ETA: 49s - loss: 0.3323 - regression_loss: 0.3118 - classification_loss: 0.0205 402/500 [=======================>......] - ETA: 49s - loss: 0.3329 - regression_loss: 0.3123 - classification_loss: 0.0207 403/500 [=======================>......] - ETA: 48s - loss: 0.3330 - regression_loss: 0.3124 - classification_loss: 0.0207 404/500 [=======================>......] 
- ETA: 48s - loss: 0.3329 - regression_loss: 0.3122 - classification_loss: 0.0207 405/500 [=======================>......] - ETA: 47s - loss: 0.3328 - regression_loss: 0.3121 - classification_loss: 0.0207 406/500 [=======================>......] - ETA: 47s - loss: 0.3323 - regression_loss: 0.3117 - classification_loss: 0.0207 407/500 [=======================>......] - ETA: 46s - loss: 0.3323 - regression_loss: 0.3116 - classification_loss: 0.0206 408/500 [=======================>......] - ETA: 46s - loss: 0.3318 - regression_loss: 0.3112 - classification_loss: 0.0206 409/500 [=======================>......] - ETA: 45s - loss: 0.3321 - regression_loss: 0.3115 - classification_loss: 0.0206 410/500 [=======================>......] - ETA: 45s - loss: 0.3323 - regression_loss: 0.3117 - classification_loss: 0.0206 411/500 [=======================>......] - ETA: 44s - loss: 0.3330 - regression_loss: 0.3124 - classification_loss: 0.0206 412/500 [=======================>......] - ETA: 44s - loss: 0.3327 - regression_loss: 0.3121 - classification_loss: 0.0206 413/500 [=======================>......] - ETA: 43s - loss: 0.3326 - regression_loss: 0.3121 - classification_loss: 0.0206 414/500 [=======================>......] - ETA: 43s - loss: 0.3331 - regression_loss: 0.3125 - classification_loss: 0.0206 415/500 [=======================>......] - ETA: 42s - loss: 0.3327 - regression_loss: 0.3122 - classification_loss: 0.0205 416/500 [=======================>......] - ETA: 42s - loss: 0.3325 - regression_loss: 0.3120 - classification_loss: 0.0205 417/500 [========================>.....] - ETA: 41s - loss: 0.3330 - regression_loss: 0.3124 - classification_loss: 0.0206 418/500 [========================>.....] - ETA: 41s - loss: 0.3329 - regression_loss: 0.3123 - classification_loss: 0.0206 419/500 [========================>.....] - ETA: 40s - loss: 0.3327 - regression_loss: 0.3121 - classification_loss: 0.0206 420/500 [========================>.....] 
- ETA: 40s - loss: 0.3324 - regression_loss: 0.3118 - classification_loss: 0.0206 421/500 [========================>.....] - ETA: 39s - loss: 0.3325 - regression_loss: 0.3120 - classification_loss: 0.0206 422/500 [========================>.....] - ETA: 39s - loss: 0.3326 - regression_loss: 0.3120 - classification_loss: 0.0206 423/500 [========================>.....] - ETA: 38s - loss: 0.3322 - regression_loss: 0.3117 - classification_loss: 0.0205 424/500 [========================>.....] - ETA: 38s - loss: 0.3327 - regression_loss: 0.3121 - classification_loss: 0.0206 425/500 [========================>.....] - ETA: 37s - loss: 0.3322 - regression_loss: 0.3116 - classification_loss: 0.0205 426/500 [========================>.....] - ETA: 37s - loss: 0.3326 - regression_loss: 0.3120 - classification_loss: 0.0205 427/500 [========================>.....] - ETA: 36s - loss: 0.3327 - regression_loss: 0.3121 - classification_loss: 0.0206 428/500 [========================>.....] - ETA: 36s - loss: 0.3329 - regression_loss: 0.3123 - classification_loss: 0.0206 429/500 [========================>.....] - ETA: 35s - loss: 0.3330 - regression_loss: 0.3124 - classification_loss: 0.0206 430/500 [========================>.....] - ETA: 35s - loss: 0.3337 - regression_loss: 0.3131 - classification_loss: 0.0206 431/500 [========================>.....] - ETA: 34s - loss: 0.3334 - regression_loss: 0.3128 - classification_loss: 0.0206 432/500 [========================>.....] - ETA: 34s - loss: 0.3336 - regression_loss: 0.3130 - classification_loss: 0.0206 433/500 [========================>.....] - ETA: 33s - loss: 0.3335 - regression_loss: 0.3129 - classification_loss: 0.0206 434/500 [=========================>....] - ETA: 33s - loss: 0.3333 - regression_loss: 0.3127 - classification_loss: 0.0206 435/500 [=========================>....] - ETA: 32s - loss: 0.3332 - regression_loss: 0.3127 - classification_loss: 0.0205 436/500 [=========================>....] 
- ETA: 32s - loss: 0.3338 - regression_loss: 0.3132 - classification_loss: 0.0206 437/500 [=========================>....] - ETA: 31s - loss: 0.3343 - regression_loss: 0.3137 - classification_loss: 0.0206 438/500 [=========================>....] - ETA: 31s - loss: 0.3350 - regression_loss: 0.3143 - classification_loss: 0.0207 439/500 [=========================>....] - ETA: 30s - loss: 0.3345 - regression_loss: 0.3138 - classification_loss: 0.0206 440/500 [=========================>....] - ETA: 30s - loss: 0.3343 - regression_loss: 0.3136 - classification_loss: 0.0206 441/500 [=========================>....] - ETA: 29s - loss: 0.3346 - regression_loss: 0.3139 - classification_loss: 0.0207 442/500 [=========================>....] - ETA: 29s - loss: 0.3344 - regression_loss: 0.3138 - classification_loss: 0.0207 443/500 [=========================>....] - ETA: 28s - loss: 0.3343 - regression_loss: 0.3136 - classification_loss: 0.0206 444/500 [=========================>....] - ETA: 28s - loss: 0.3346 - regression_loss: 0.3138 - classification_loss: 0.0207 445/500 [=========================>....] - ETA: 27s - loss: 0.3348 - regression_loss: 0.3140 - classification_loss: 0.0208 446/500 [=========================>....] - ETA: 27s - loss: 0.3345 - regression_loss: 0.3138 - classification_loss: 0.0207 447/500 [=========================>....] - ETA: 26s - loss: 0.3346 - regression_loss: 0.3139 - classification_loss: 0.0207 448/500 [=========================>....] - ETA: 26s - loss: 0.3350 - regression_loss: 0.3143 - classification_loss: 0.0207 449/500 [=========================>....] - ETA: 25s - loss: 0.3347 - regression_loss: 0.3140 - classification_loss: 0.0207 450/500 [==========================>...] - ETA: 25s - loss: 0.3347 - regression_loss: 0.3140 - classification_loss: 0.0207 451/500 [==========================>...] - ETA: 24s - loss: 0.3349 - regression_loss: 0.3142 - classification_loss: 0.0207 452/500 [==========================>...] 
- ETA: 24s - loss: 0.3345 - regression_loss: 0.3138 - classification_loss: 0.0207 453/500 [==========================>...] - ETA: 23s - loss: 0.3343 - regression_loss: 0.3136 - classification_loss: 0.0207 454/500 [==========================>...] - ETA: 23s - loss: 0.3342 - regression_loss: 0.3135 - classification_loss: 0.0207 455/500 [==========================>...] - ETA: 22s - loss: 0.3339 - regression_loss: 0.3132 - classification_loss: 0.0207 456/500 [==========================>...] - ETA: 22s - loss: 0.3334 - regression_loss: 0.3127 - classification_loss: 0.0206 457/500 [==========================>...] - ETA: 21s - loss: 0.3333 - regression_loss: 0.3127 - classification_loss: 0.0206 458/500 [==========================>...] - ETA: 21s - loss: 0.3331 - regression_loss: 0.3125 - classification_loss: 0.0206 459/500 [==========================>...] - ETA: 20s - loss: 0.3331 - regression_loss: 0.3125 - classification_loss: 0.0206 460/500 [==========================>...] - ETA: 20s - loss: 0.3332 - regression_loss: 0.3126 - classification_loss: 0.0206 461/500 [==========================>...] - ETA: 19s - loss: 0.3335 - regression_loss: 0.3128 - classification_loss: 0.0207 462/500 [==========================>...] - ETA: 19s - loss: 0.3333 - regression_loss: 0.3126 - classification_loss: 0.0207 463/500 [==========================>...] - ETA: 18s - loss: 0.3329 - regression_loss: 0.3122 - classification_loss: 0.0207 464/500 [==========================>...] - ETA: 18s - loss: 0.3327 - regression_loss: 0.3121 - classification_loss: 0.0206 465/500 [==========================>...] - ETA: 17s - loss: 0.3331 - regression_loss: 0.3124 - classification_loss: 0.0207 466/500 [==========================>...] - ETA: 17s - loss: 0.3333 - regression_loss: 0.3126 - classification_loss: 0.0207 467/500 [===========================>..] - ETA: 16s - loss: 0.3331 - regression_loss: 0.3124 - classification_loss: 0.0207 468/500 [===========================>..] 
- ETA: 16s - loss: 0.3330 - regression_loss: 0.3123 - classification_loss: 0.0207 469/500 [===========================>..] - ETA: 15s - loss: 0.3329 - regression_loss: 0.3122 - classification_loss: 0.0207 470/500 [===========================>..] - ETA: 15s - loss: 0.3327 - regression_loss: 0.3121 - classification_loss: 0.0207 471/500 [===========================>..] - ETA: 14s - loss: 0.3331 - regression_loss: 0.3125 - classification_loss: 0.0206 472/500 [===========================>..] - ETA: 14s - loss: 0.3331 - regression_loss: 0.3124 - classification_loss: 0.0207 473/500 [===========================>..] - ETA: 13s - loss: 0.3331 - regression_loss: 0.3124 - classification_loss: 0.0207 474/500 [===========================>..] - ETA: 13s - loss: 0.3329 - regression_loss: 0.3123 - classification_loss: 0.0207 475/500 [===========================>..] - ETA: 12s - loss: 0.3332 - regression_loss: 0.3125 - classification_loss: 0.0207 476/500 [===========================>..] - ETA: 12s - loss: 0.3329 - regression_loss: 0.3123 - classification_loss: 0.0207 477/500 [===========================>..] - ETA: 11s - loss: 0.3326 - regression_loss: 0.3120 - classification_loss: 0.0207 478/500 [===========================>..] - ETA: 11s - loss: 0.3324 - regression_loss: 0.3117 - classification_loss: 0.0207 479/500 [===========================>..] - ETA: 10s - loss: 0.3322 - regression_loss: 0.3115 - classification_loss: 0.0207 480/500 [===========================>..] - ETA: 10s - loss: 0.3319 - regression_loss: 0.3112 - classification_loss: 0.0206 481/500 [===========================>..] - ETA: 9s - loss: 0.3317 - regression_loss: 0.3110 - classification_loss: 0.0206  482/500 [===========================>..] - ETA: 9s - loss: 0.3311 - regression_loss: 0.3105 - classification_loss: 0.0206 483/500 [===========================>..] - ETA: 8s - loss: 0.3311 - regression_loss: 0.3105 - classification_loss: 0.0206 484/500 [============================>.] 
- ETA: 8s - loss: 0.3310 - regression_loss: 0.3104 - classification_loss: 0.0206 485/500 [============================>.] - ETA: 7s - loss: 0.3317 - regression_loss: 0.3110 - classification_loss: 0.0206 486/500 [============================>.] - ETA: 7s - loss: 0.3316 - regression_loss: 0.3109 - classification_loss: 0.0206 487/500 [============================>.] - ETA: 6s - loss: 0.3311 - regression_loss: 0.3105 - classification_loss: 0.0206 488/500 [============================>.] - ETA: 6s - loss: 0.3310 - regression_loss: 0.3104 - classification_loss: 0.0206 489/500 [============================>.] - ETA: 5s - loss: 0.3313 - regression_loss: 0.3107 - classification_loss: 0.0206 490/500 [============================>.] - ETA: 5s - loss: 0.3313 - regression_loss: 0.3107 - classification_loss: 0.0206 491/500 [============================>.] - ETA: 4s - loss: 0.3312 - regression_loss: 0.3106 - classification_loss: 0.0206 492/500 [============================>.] - ETA: 4s - loss: 0.3314 - regression_loss: 0.3108 - classification_loss: 0.0206 493/500 [============================>.] - ETA: 3s - loss: 0.3312 - regression_loss: 0.3106 - classification_loss: 0.0206 494/500 [============================>.] - ETA: 3s - loss: 0.3309 - regression_loss: 0.3103 - classification_loss: 0.0206 495/500 [============================>.] - ETA: 2s - loss: 0.3309 - regression_loss: 0.3104 - classification_loss: 0.0206 496/500 [============================>.] - ETA: 2s - loss: 0.3310 - regression_loss: 0.3104 - classification_loss: 0.0206 497/500 [============================>.] - ETA: 1s - loss: 0.3310 - regression_loss: 0.3104 - classification_loss: 0.0206 498/500 [============================>.] - ETA: 1s - loss: 0.3308 - regression_loss: 0.3102 - classification_loss: 0.0205 499/500 [============================>.] 
- ETA: 0s - loss: 0.3312 - regression_loss: 0.3107 - classification_loss: 0.0205 500/500 [==============================] - 251s 502ms/step - loss: 0.3312 - regression_loss: 0.3106 - classification_loss: 0.0206 1172 instances of class plum with average precision: 0.7312 mAP: 0.7312 Epoch 00015: saving model to ./training/snapshots/resnet101_pascal_15.h5 Epoch 16/150 1/500 [..............................] - ETA: 4:06 - loss: 0.3179 - regression_loss: 0.3009 - classification_loss: 0.0170 2/500 [..............................] - ETA: 4:05 - loss: 0.4513 - regression_loss: 0.4357 - classification_loss: 0.0156 3/500 [..............................] - ETA: 4:03 - loss: 0.4040 - regression_loss: 0.3880 - classification_loss: 0.0160 4/500 [..............................] - ETA: 4:02 - loss: 0.3503 - regression_loss: 0.3338 - classification_loss: 0.0165 5/500 [..............................] - ETA: 4:03 - loss: 0.3471 - regression_loss: 0.3303 - classification_loss: 0.0167 6/500 [..............................] - ETA: 4:03 - loss: 0.3181 - regression_loss: 0.3027 - classification_loss: 0.0154 7/500 [..............................] - ETA: 4:03 - loss: 0.3168 - regression_loss: 0.2986 - classification_loss: 0.0182 8/500 [..............................] - ETA: 4:04 - loss: 0.3012 - regression_loss: 0.2844 - classification_loss: 0.0168 9/500 [..............................] - ETA: 4:04 - loss: 0.3120 - regression_loss: 0.2944 - classification_loss: 0.0176 10/500 [..............................] - ETA: 4:03 - loss: 0.3006 - regression_loss: 0.2841 - classification_loss: 0.0165 11/500 [..............................] - ETA: 4:02 - loss: 0.2821 - regression_loss: 0.2665 - classification_loss: 0.0156 12/500 [..............................] - ETA: 4:03 - loss: 0.3121 - regression_loss: 0.2934 - classification_loss: 0.0187 13/500 [..............................] - ETA: 4:05 - loss: 0.3088 - regression_loss: 0.2905 - classification_loss: 0.0183 14/500 [..............................] 
- ETA: 4:04 - loss: 0.3051 - regression_loss: 0.2866 - classification_loss: 0.0185 15/500 [..............................] - ETA: 4:04 - loss: 0.3118 - regression_loss: 0.2928 - classification_loss: 0.0190 16/500 [..............................] - ETA: 4:04 - loss: 0.3031 - regression_loss: 0.2848 - classification_loss: 0.0183 17/500 [>.............................] - ETA: 4:03 - loss: 0.2980 - regression_loss: 0.2797 - classification_loss: 0.0182 18/500 [>.............................] - ETA: 4:03 - loss: 0.2864 - regression_loss: 0.2689 - classification_loss: 0.0176 19/500 [>.............................] - ETA: 4:03 - loss: 0.2951 - regression_loss: 0.2779 - classification_loss: 0.0172 20/500 [>.............................] - ETA: 4:02 - loss: 0.2981 - regression_loss: 0.2795 - classification_loss: 0.0186 21/500 [>.............................] - ETA: 4:02 - loss: 0.2999 - regression_loss: 0.2805 - classification_loss: 0.0195 22/500 [>.............................] - ETA: 4:01 - loss: 0.2982 - regression_loss: 0.2788 - classification_loss: 0.0194 23/500 [>.............................] - ETA: 4:00 - loss: 0.2929 - regression_loss: 0.2740 - classification_loss: 0.0189 24/500 [>.............................] - ETA: 3:59 - loss: 0.2899 - regression_loss: 0.2716 - classification_loss: 0.0184 25/500 [>.............................] - ETA: 3:59 - loss: 0.2907 - regression_loss: 0.2723 - classification_loss: 0.0184 26/500 [>.............................] - ETA: 3:59 - loss: 0.3054 - regression_loss: 0.2853 - classification_loss: 0.0202 27/500 [>.............................] - ETA: 3:58 - loss: 0.2993 - regression_loss: 0.2796 - classification_loss: 0.0197 28/500 [>.............................] - ETA: 3:57 - loss: 0.3011 - regression_loss: 0.2814 - classification_loss: 0.0197 29/500 [>.............................] - ETA: 3:56 - loss: 0.3016 - regression_loss: 0.2821 - classification_loss: 0.0195 30/500 [>.............................] 
- ETA: 3:55 - loss: 0.2975 - regression_loss: 0.2783 - classification_loss: 0.0192 31/500 [>.............................] - ETA: 3:55 - loss: 0.2952 - regression_loss: 0.2763 - classification_loss: 0.0188 32/500 [>.............................] - ETA: 3:54 - loss: 0.2903 - regression_loss: 0.2719 - classification_loss: 0.0185 33/500 [>.............................] - ETA: 3:54 - loss: 0.2971 - regression_loss: 0.2772 - classification_loss: 0.0199 34/500 [=>............................] - ETA: 3:53 - loss: 0.2920 - regression_loss: 0.2725 - classification_loss: 0.0195 35/500 [=>............................] - ETA: 3:52 - loss: 0.2938 - regression_loss: 0.2744 - classification_loss: 0.0195 36/500 [=>............................] - ETA: 3:52 - loss: 0.2949 - regression_loss: 0.2754 - classification_loss: 0.0194 37/500 [=>............................] - ETA: 3:52 - loss: 0.2973 - regression_loss: 0.2780 - classification_loss: 0.0193 38/500 [=>............................] - ETA: 3:51 - loss: 0.3027 - regression_loss: 0.2829 - classification_loss: 0.0198 39/500 [=>............................] - ETA: 3:51 - loss: 0.3055 - regression_loss: 0.2851 - classification_loss: 0.0204 40/500 [=>............................] - ETA: 3:50 - loss: 0.3032 - regression_loss: 0.2830 - classification_loss: 0.0201 41/500 [=>............................] - ETA: 3:50 - loss: 0.2997 - regression_loss: 0.2800 - classification_loss: 0.0197 42/500 [=>............................] - ETA: 3:49 - loss: 0.2980 - regression_loss: 0.2782 - classification_loss: 0.0198 43/500 [=>............................] - ETA: 3:49 - loss: 0.2978 - regression_loss: 0.2782 - classification_loss: 0.0197 44/500 [=>............................] - ETA: 3:49 - loss: 0.2973 - regression_loss: 0.2778 - classification_loss: 0.0195 45/500 [=>............................] - ETA: 3:48 - loss: 0.2970 - regression_loss: 0.2772 - classification_loss: 0.0198 46/500 [=>............................] 
- ETA: 3:47 - loss: 0.2964 - regression_loss: 0.2767 - classification_loss: 0.0197
[... intermediate progress-bar updates for steps 47/500 through 381/500 elided: the ETA counted down from 3:47 to 59s while total loss fluctuated between ~0.29 and ~0.31 (regression_loss ~0.27–0.30, classification_loss ~0.018–0.020) ...]
- ETA: 59s - loss: 0.3135 - regression_loss: 0.2939 - classification_loss: 0.0196  382/500 [=====================>........] 
- ETA: 59s - loss: 0.3131 - regression_loss: 0.2935 - classification_loss: 0.0196 383/500 [=====================>........] - ETA: 58s - loss: 0.3136 - regression_loss: 0.2941 - classification_loss: 0.0196 384/500 [======================>.......] - ETA: 58s - loss: 0.3139 - regression_loss: 0.2943 - classification_loss: 0.0196 385/500 [======================>.......] - ETA: 57s - loss: 0.3135 - regression_loss: 0.2940 - classification_loss: 0.0195 386/500 [======================>.......] - ETA: 57s - loss: 0.3134 - regression_loss: 0.2938 - classification_loss: 0.0196 387/500 [======================>.......] - ETA: 56s - loss: 0.3131 - regression_loss: 0.2935 - classification_loss: 0.0195 388/500 [======================>.......] - ETA: 56s - loss: 0.3129 - regression_loss: 0.2934 - classification_loss: 0.0195 389/500 [======================>.......] - ETA: 55s - loss: 0.3123 - regression_loss: 0.2928 - classification_loss: 0.0195 390/500 [======================>.......] - ETA: 55s - loss: 0.3125 - regression_loss: 0.2930 - classification_loss: 0.0195 391/500 [======================>.......] - ETA: 54s - loss: 0.3132 - regression_loss: 0.2936 - classification_loss: 0.0195 392/500 [======================>.......] - ETA: 54s - loss: 0.3130 - regression_loss: 0.2935 - classification_loss: 0.0195 393/500 [======================>.......] - ETA: 53s - loss: 0.3127 - regression_loss: 0.2932 - classification_loss: 0.0195 394/500 [======================>.......] - ETA: 53s - loss: 0.3124 - regression_loss: 0.2929 - classification_loss: 0.0194 395/500 [======================>.......] - ETA: 52s - loss: 0.3130 - regression_loss: 0.2935 - classification_loss: 0.0195 396/500 [======================>.......] - ETA: 52s - loss: 0.3132 - regression_loss: 0.2937 - classification_loss: 0.0195 397/500 [======================>.......] - ETA: 51s - loss: 0.3138 - regression_loss: 0.2943 - classification_loss: 0.0195 398/500 [======================>.......] 
- ETA: 51s - loss: 0.3134 - regression_loss: 0.2939 - classification_loss: 0.0195 399/500 [======================>.......] - ETA: 50s - loss: 0.3131 - regression_loss: 0.2936 - classification_loss: 0.0194 400/500 [=======================>......] - ETA: 50s - loss: 0.3129 - regression_loss: 0.2935 - classification_loss: 0.0194 401/500 [=======================>......] - ETA: 49s - loss: 0.3124 - regression_loss: 0.2931 - classification_loss: 0.0194 402/500 [=======================>......] - ETA: 49s - loss: 0.3129 - regression_loss: 0.2935 - classification_loss: 0.0194 403/500 [=======================>......] - ETA: 48s - loss: 0.3126 - regression_loss: 0.2933 - classification_loss: 0.0194 404/500 [=======================>......] - ETA: 48s - loss: 0.3126 - regression_loss: 0.2932 - classification_loss: 0.0194 405/500 [=======================>......] - ETA: 47s - loss: 0.3124 - regression_loss: 0.2931 - classification_loss: 0.0194 406/500 [=======================>......] - ETA: 47s - loss: 0.3125 - regression_loss: 0.2932 - classification_loss: 0.0193 407/500 [=======================>......] - ETA: 46s - loss: 0.3128 - regression_loss: 0.2934 - classification_loss: 0.0193 408/500 [=======================>......] - ETA: 46s - loss: 0.3126 - regression_loss: 0.2933 - classification_loss: 0.0193 409/500 [=======================>......] - ETA: 45s - loss: 0.3124 - regression_loss: 0.2931 - classification_loss: 0.0193 410/500 [=======================>......] - ETA: 45s - loss: 0.3125 - regression_loss: 0.2932 - classification_loss: 0.0193 411/500 [=======================>......] - ETA: 44s - loss: 0.3125 - regression_loss: 0.2932 - classification_loss: 0.0193 412/500 [=======================>......] - ETA: 44s - loss: 0.3128 - regression_loss: 0.2934 - classification_loss: 0.0194 413/500 [=======================>......] - ETA: 43s - loss: 0.3128 - regression_loss: 0.2934 - classification_loss: 0.0194 414/500 [=======================>......] 
- ETA: 43s - loss: 0.3134 - regression_loss: 0.2940 - classification_loss: 0.0194 415/500 [=======================>......] - ETA: 42s - loss: 0.3130 - regression_loss: 0.2936 - classification_loss: 0.0194 416/500 [=======================>......] - ETA: 42s - loss: 0.3131 - regression_loss: 0.2938 - classification_loss: 0.0194 417/500 [========================>.....] - ETA: 41s - loss: 0.3127 - regression_loss: 0.2933 - classification_loss: 0.0193 418/500 [========================>.....] - ETA: 41s - loss: 0.3123 - regression_loss: 0.2930 - classification_loss: 0.0193 419/500 [========================>.....] - ETA: 40s - loss: 0.3121 - regression_loss: 0.2928 - classification_loss: 0.0193 420/500 [========================>.....] - ETA: 40s - loss: 0.3120 - regression_loss: 0.2927 - classification_loss: 0.0193 421/500 [========================>.....] - ETA: 39s - loss: 0.3126 - regression_loss: 0.2933 - classification_loss: 0.0193 422/500 [========================>.....] - ETA: 39s - loss: 0.3121 - regression_loss: 0.2928 - classification_loss: 0.0193 423/500 [========================>.....] - ETA: 38s - loss: 0.3119 - regression_loss: 0.2927 - classification_loss: 0.0193 424/500 [========================>.....] - ETA: 38s - loss: 0.3125 - regression_loss: 0.2932 - classification_loss: 0.0193 425/500 [========================>.....] - ETA: 37s - loss: 0.3121 - regression_loss: 0.2928 - classification_loss: 0.0192 426/500 [========================>.....] - ETA: 37s - loss: 0.3118 - regression_loss: 0.2925 - classification_loss: 0.0193 427/500 [========================>.....] - ETA: 36s - loss: 0.3114 - regression_loss: 0.2922 - classification_loss: 0.0192 428/500 [========================>.....] - ETA: 36s - loss: 0.3112 - regression_loss: 0.2920 - classification_loss: 0.0192 429/500 [========================>.....] - ETA: 35s - loss: 0.3112 - regression_loss: 0.2920 - classification_loss: 0.0192 430/500 [========================>.....] 
- ETA: 35s - loss: 0.3110 - regression_loss: 0.2918 - classification_loss: 0.0192 431/500 [========================>.....] - ETA: 34s - loss: 0.3113 - regression_loss: 0.2922 - classification_loss: 0.0192 432/500 [========================>.....] - ETA: 34s - loss: 0.3110 - regression_loss: 0.2918 - classification_loss: 0.0192 433/500 [========================>.....] - ETA: 33s - loss: 0.3110 - regression_loss: 0.2919 - classification_loss: 0.0192 434/500 [=========================>....] - ETA: 33s - loss: 0.3110 - regression_loss: 0.2919 - classification_loss: 0.0191 435/500 [=========================>....] - ETA: 32s - loss: 0.3108 - regression_loss: 0.2917 - classification_loss: 0.0191 436/500 [=========================>....] - ETA: 32s - loss: 0.3109 - regression_loss: 0.2918 - classification_loss: 0.0191 437/500 [=========================>....] - ETA: 31s - loss: 0.3111 - regression_loss: 0.2919 - classification_loss: 0.0191 438/500 [=========================>....] - ETA: 31s - loss: 0.3109 - regression_loss: 0.2918 - classification_loss: 0.0191 439/500 [=========================>....] - ETA: 30s - loss: 0.3106 - regression_loss: 0.2916 - classification_loss: 0.0191 440/500 [=========================>....] - ETA: 30s - loss: 0.3109 - regression_loss: 0.2918 - classification_loss: 0.0191 441/500 [=========================>....] - ETA: 29s - loss: 0.3111 - regression_loss: 0.2919 - classification_loss: 0.0192 442/500 [=========================>....] - ETA: 29s - loss: 0.3109 - regression_loss: 0.2917 - classification_loss: 0.0191 443/500 [=========================>....] - ETA: 28s - loss: 0.3111 - regression_loss: 0.2919 - classification_loss: 0.0192 444/500 [=========================>....] - ETA: 28s - loss: 0.3111 - regression_loss: 0.2919 - classification_loss: 0.0192 445/500 [=========================>....] - ETA: 27s - loss: 0.3109 - regression_loss: 0.2917 - classification_loss: 0.0191 446/500 [=========================>....] 
- ETA: 27s - loss: 0.3105 - regression_loss: 0.2914 - classification_loss: 0.0191 447/500 [=========================>....] - ETA: 26s - loss: 0.3111 - regression_loss: 0.2920 - classification_loss: 0.0191 448/500 [=========================>....] - ETA: 26s - loss: 0.3111 - regression_loss: 0.2920 - classification_loss: 0.0191 449/500 [=========================>....] - ETA: 25s - loss: 0.3111 - regression_loss: 0.2920 - classification_loss: 0.0191 450/500 [==========================>...] - ETA: 25s - loss: 0.3110 - regression_loss: 0.2919 - classification_loss: 0.0191 451/500 [==========================>...] - ETA: 24s - loss: 0.3111 - regression_loss: 0.2920 - classification_loss: 0.0191 452/500 [==========================>...] - ETA: 24s - loss: 0.3110 - regression_loss: 0.2920 - classification_loss: 0.0191 453/500 [==========================>...] - ETA: 23s - loss: 0.3114 - regression_loss: 0.2923 - classification_loss: 0.0191 454/500 [==========================>...] - ETA: 23s - loss: 0.3111 - regression_loss: 0.2920 - classification_loss: 0.0191 455/500 [==========================>...] - ETA: 22s - loss: 0.3117 - regression_loss: 0.2926 - classification_loss: 0.0191 456/500 [==========================>...] - ETA: 22s - loss: 0.3119 - regression_loss: 0.2928 - classification_loss: 0.0191 457/500 [==========================>...] - ETA: 21s - loss: 0.3122 - regression_loss: 0.2931 - classification_loss: 0.0191 458/500 [==========================>...] - ETA: 21s - loss: 0.3123 - regression_loss: 0.2931 - classification_loss: 0.0191 459/500 [==========================>...] - ETA: 20s - loss: 0.3123 - regression_loss: 0.2932 - classification_loss: 0.0191 460/500 [==========================>...] - ETA: 20s - loss: 0.3126 - regression_loss: 0.2935 - classification_loss: 0.0191 461/500 [==========================>...] - ETA: 19s - loss: 0.3125 - regression_loss: 0.2934 - classification_loss: 0.0191 462/500 [==========================>...] 
- ETA: 19s - loss: 0.3127 - regression_loss: 0.2936 - classification_loss: 0.0192 463/500 [==========================>...] - ETA: 18s - loss: 0.3126 - regression_loss: 0.2934 - classification_loss: 0.0191 464/500 [==========================>...] - ETA: 18s - loss: 0.3122 - regression_loss: 0.2931 - classification_loss: 0.0191 465/500 [==========================>...] - ETA: 17s - loss: 0.3124 - regression_loss: 0.2933 - classification_loss: 0.0191 466/500 [==========================>...] - ETA: 17s - loss: 0.3123 - regression_loss: 0.2932 - classification_loss: 0.0191 467/500 [===========================>..] - ETA: 16s - loss: 0.3121 - regression_loss: 0.2930 - classification_loss: 0.0191 468/500 [===========================>..] - ETA: 16s - loss: 0.3117 - regression_loss: 0.2926 - classification_loss: 0.0191 469/500 [===========================>..] - ETA: 15s - loss: 0.3117 - regression_loss: 0.2926 - classification_loss: 0.0191 470/500 [===========================>..] - ETA: 15s - loss: 0.3115 - regression_loss: 0.2924 - classification_loss: 0.0191 471/500 [===========================>..] - ETA: 14s - loss: 0.3116 - regression_loss: 0.2925 - classification_loss: 0.0191 472/500 [===========================>..] - ETA: 14s - loss: 0.3115 - regression_loss: 0.2924 - classification_loss: 0.0190 473/500 [===========================>..] - ETA: 13s - loss: 0.3114 - regression_loss: 0.2924 - classification_loss: 0.0190 474/500 [===========================>..] - ETA: 13s - loss: 0.3116 - regression_loss: 0.2925 - classification_loss: 0.0191 475/500 [===========================>..] - ETA: 12s - loss: 0.3120 - regression_loss: 0.2929 - classification_loss: 0.0191 476/500 [===========================>..] - ETA: 12s - loss: 0.3122 - regression_loss: 0.2930 - classification_loss: 0.0191 477/500 [===========================>..] - ETA: 11s - loss: 0.3126 - regression_loss: 0.2935 - classification_loss: 0.0191 478/500 [===========================>..] 
- ETA: 11s - loss: 0.3130 - regression_loss: 0.2939 - classification_loss: 0.0191 479/500 [===========================>..] - ETA: 10s - loss: 0.3129 - regression_loss: 0.2938 - classification_loss: 0.0191 480/500 [===========================>..] - ETA: 10s - loss: 0.3131 - regression_loss: 0.2939 - classification_loss: 0.0191 481/500 [===========================>..] - ETA: 9s - loss: 0.3128 - regression_loss: 0.2937 - classification_loss: 0.0191  482/500 [===========================>..] - ETA: 9s - loss: 0.3127 - regression_loss: 0.2936 - classification_loss: 0.0191 483/500 [===========================>..] - ETA: 8s - loss: 0.3126 - regression_loss: 0.2935 - classification_loss: 0.0191 484/500 [============================>.] - ETA: 8s - loss: 0.3126 - regression_loss: 0.2935 - classification_loss: 0.0191 485/500 [============================>.] - ETA: 7s - loss: 0.3127 - regression_loss: 0.2936 - classification_loss: 0.0191 486/500 [============================>.] - ETA: 7s - loss: 0.3128 - regression_loss: 0.2937 - classification_loss: 0.0191 487/500 [============================>.] - ETA: 6s - loss: 0.3126 - regression_loss: 0.2935 - classification_loss: 0.0191 488/500 [============================>.] - ETA: 6s - loss: 0.3131 - regression_loss: 0.2940 - classification_loss: 0.0191 489/500 [============================>.] - ETA: 5s - loss: 0.3129 - regression_loss: 0.2937 - classification_loss: 0.0191 490/500 [============================>.] - ETA: 5s - loss: 0.3132 - regression_loss: 0.2941 - classification_loss: 0.0191 491/500 [============================>.] - ETA: 4s - loss: 0.3133 - regression_loss: 0.2942 - classification_loss: 0.0191 492/500 [============================>.] - ETA: 4s - loss: 0.3133 - regression_loss: 0.2941 - classification_loss: 0.0191 493/500 [============================>.] - ETA: 3s - loss: 0.3136 - regression_loss: 0.2945 - classification_loss: 0.0191 494/500 [============================>.] 
- ETA: 3s - loss: 0.3137 - regression_loss: 0.2946 - classification_loss: 0.0191 495/500 [============================>.] - ETA: 2s - loss: 0.3137 - regression_loss: 0.2946 - classification_loss: 0.0191 496/500 [============================>.] - ETA: 2s - loss: 0.3139 - regression_loss: 0.2948 - classification_loss: 0.0191 497/500 [============================>.] - ETA: 1s - loss: 0.3136 - regression_loss: 0.2944 - classification_loss: 0.0191 498/500 [============================>.] - ETA: 1s - loss: 0.3134 - regression_loss: 0.2943 - classification_loss: 0.0191 499/500 [============================>.] - ETA: 0s - loss: 0.3139 - regression_loss: 0.2948 - classification_loss: 0.0191 500/500 [==============================] - 251s 502ms/step - loss: 0.3146 - regression_loss: 0.2953 - classification_loss: 0.0192 1172 instances of class plum with average precision: 0.7370 mAP: 0.7370 Epoch 00016: saving model to ./training/snapshots/resnet101_pascal_16.h5 Epoch 17/150 1/500 [..............................] - ETA: 4:31 - loss: 0.2997 - regression_loss: 0.2855 - classification_loss: 0.0142 2/500 [..............................] - ETA: 4:24 - loss: 0.3604 - regression_loss: 0.3390 - classification_loss: 0.0214 3/500 [..............................] - ETA: 4:21 - loss: 0.3450 - regression_loss: 0.3237 - classification_loss: 0.0214 4/500 [..............................] - ETA: 4:18 - loss: 0.3426 - regression_loss: 0.3182 - classification_loss: 0.0245 5/500 [..............................] - ETA: 4:16 - loss: 0.3484 - regression_loss: 0.3251 - classification_loss: 0.0233 6/500 [..............................] - ETA: 4:12 - loss: 0.3563 - regression_loss: 0.3343 - classification_loss: 0.0220 7/500 [..............................] - ETA: 4:12 - loss: 0.3874 - regression_loss: 0.3606 - classification_loss: 0.0269 8/500 [..............................] - ETA: 4:12 - loss: 0.4156 - regression_loss: 0.3878 - classification_loss: 0.0277 9/500 [..............................] 
- ETA: 2:55 - loss: 0.3555 - regression_loss: 0.3340 - classification_loss: 0.0215 154/500 [========>.....................] - ETA: 2:55 - loss: 0.3558 - regression_loss: 0.3344 - classification_loss: 0.0215 155/500 [========>.....................] - ETA: 2:54 - loss: 0.3554 - regression_loss: 0.3339 - classification_loss: 0.0215 156/500 [========>.....................] - ETA: 2:54 - loss: 0.3549 - regression_loss: 0.3335 - classification_loss: 0.0214 157/500 [========>.....................] - ETA: 2:53 - loss: 0.3547 - regression_loss: 0.3332 - classification_loss: 0.0215 158/500 [========>.....................] - ETA: 2:53 - loss: 0.3540 - regression_loss: 0.3326 - classification_loss: 0.0214 159/500 [========>.....................] - ETA: 2:52 - loss: 0.3586 - regression_loss: 0.3358 - classification_loss: 0.0229 160/500 [========>.....................] - ETA: 2:51 - loss: 0.3582 - regression_loss: 0.3353 - classification_loss: 0.0229 161/500 [========>.....................] - ETA: 2:51 - loss: 0.3609 - regression_loss: 0.3379 - classification_loss: 0.0230 162/500 [========>.....................] - ETA: 2:51 - loss: 0.3616 - regression_loss: 0.3383 - classification_loss: 0.0232 163/500 [========>.....................] - ETA: 2:50 - loss: 0.3643 - regression_loss: 0.3406 - classification_loss: 0.0236 164/500 [========>.....................] - ETA: 2:50 - loss: 0.3640 - regression_loss: 0.3403 - classification_loss: 0.0236 165/500 [========>.....................] - ETA: 2:49 - loss: 0.3636 - regression_loss: 0.3400 - classification_loss: 0.0236 166/500 [========>.....................] - ETA: 2:48 - loss: 0.3622 - regression_loss: 0.3387 - classification_loss: 0.0235 167/500 [=========>....................] - ETA: 2:48 - loss: 0.3616 - regression_loss: 0.3382 - classification_loss: 0.0234 168/500 [=========>....................] - ETA: 2:47 - loss: 0.3609 - regression_loss: 0.3375 - classification_loss: 0.0234 169/500 [=========>....................] 
- ETA: 2:47 - loss: 0.3610 - regression_loss: 0.3376 - classification_loss: 0.0234 170/500 [=========>....................] - ETA: 2:46 - loss: 0.3601 - regression_loss: 0.3367 - classification_loss: 0.0234 171/500 [=========>....................] - ETA: 2:46 - loss: 0.3605 - regression_loss: 0.3370 - classification_loss: 0.0236 172/500 [=========>....................] - ETA: 2:45 - loss: 0.3597 - regression_loss: 0.3362 - classification_loss: 0.0235 173/500 [=========>....................] - ETA: 2:45 - loss: 0.3620 - regression_loss: 0.3385 - classification_loss: 0.0235 174/500 [=========>....................] - ETA: 2:44 - loss: 0.3611 - regression_loss: 0.3374 - classification_loss: 0.0236 175/500 [=========>....................] - ETA: 2:44 - loss: 0.3610 - regression_loss: 0.3374 - classification_loss: 0.0236 176/500 [=========>....................] - ETA: 2:43 - loss: 0.3617 - regression_loss: 0.3382 - classification_loss: 0.0235 177/500 [=========>....................] - ETA: 2:43 - loss: 0.3621 - regression_loss: 0.3386 - classification_loss: 0.0235 178/500 [=========>....................] - ETA: 2:42 - loss: 0.3606 - regression_loss: 0.3372 - classification_loss: 0.0235 179/500 [=========>....................] - ETA: 2:42 - loss: 0.3595 - regression_loss: 0.3361 - classification_loss: 0.0234 180/500 [=========>....................] - ETA: 2:41 - loss: 0.3608 - regression_loss: 0.3371 - classification_loss: 0.0237 181/500 [=========>....................] - ETA: 2:41 - loss: 0.3604 - regression_loss: 0.3369 - classification_loss: 0.0236 182/500 [=========>....................] - ETA: 2:40 - loss: 0.3598 - regression_loss: 0.3363 - classification_loss: 0.0235 183/500 [=========>....................] - ETA: 2:40 - loss: 0.3608 - regression_loss: 0.3373 - classification_loss: 0.0235 184/500 [==========>...................] - ETA: 2:39 - loss: 0.3598 - regression_loss: 0.3364 - classification_loss: 0.0234 185/500 [==========>...................] 
- ETA: 2:39 - loss: 0.3604 - regression_loss: 0.3371 - classification_loss: 0.0233 186/500 [==========>...................] - ETA: 2:38 - loss: 0.3602 - regression_loss: 0.3369 - classification_loss: 0.0233 187/500 [==========>...................] - ETA: 2:38 - loss: 0.3593 - regression_loss: 0.3361 - classification_loss: 0.0232 188/500 [==========>...................] - ETA: 2:37 - loss: 0.3589 - regression_loss: 0.3357 - classification_loss: 0.0232 189/500 [==========>...................] - ETA: 2:37 - loss: 0.3586 - regression_loss: 0.3354 - classification_loss: 0.0231 190/500 [==========>...................] - ETA: 2:36 - loss: 0.3577 - regression_loss: 0.3347 - classification_loss: 0.0231 191/500 [==========>...................] - ETA: 2:36 - loss: 0.3580 - regression_loss: 0.3350 - classification_loss: 0.0231 192/500 [==========>...................] - ETA: 2:35 - loss: 0.3568 - regression_loss: 0.3338 - classification_loss: 0.0230 193/500 [==========>...................] - ETA: 2:35 - loss: 0.3562 - regression_loss: 0.3332 - classification_loss: 0.0229 194/500 [==========>...................] - ETA: 2:34 - loss: 0.3560 - regression_loss: 0.3331 - classification_loss: 0.0229 195/500 [==========>...................] - ETA: 2:34 - loss: 0.3551 - regression_loss: 0.3323 - classification_loss: 0.0228 196/500 [==========>...................] - ETA: 2:33 - loss: 0.3554 - regression_loss: 0.3325 - classification_loss: 0.0229 197/500 [==========>...................] - ETA: 2:33 - loss: 0.3549 - regression_loss: 0.3321 - classification_loss: 0.0228 198/500 [==========>...................] - ETA: 2:32 - loss: 0.3539 - regression_loss: 0.3312 - classification_loss: 0.0227 199/500 [==========>...................] - ETA: 2:32 - loss: 0.3540 - regression_loss: 0.3313 - classification_loss: 0.0227 200/500 [===========>..................] - ETA: 2:31 - loss: 0.3531 - regression_loss: 0.3305 - classification_loss: 0.0226 201/500 [===========>..................] 
- ETA: 2:31 - loss: 0.3529 - regression_loss: 0.3304 - classification_loss: 0.0225 202/500 [===========>..................] - ETA: 2:30 - loss: 0.3521 - regression_loss: 0.3297 - classification_loss: 0.0225 203/500 [===========>..................] - ETA: 2:30 - loss: 0.3513 - regression_loss: 0.3289 - classification_loss: 0.0224 204/500 [===========>..................] - ETA: 2:29 - loss: 0.3517 - regression_loss: 0.3293 - classification_loss: 0.0224 205/500 [===========>..................] - ETA: 2:29 - loss: 0.3510 - regression_loss: 0.3287 - classification_loss: 0.0223 206/500 [===========>..................] - ETA: 2:28 - loss: 0.3504 - regression_loss: 0.3281 - classification_loss: 0.0223 207/500 [===========>..................] - ETA: 2:28 - loss: 0.3501 - regression_loss: 0.3278 - classification_loss: 0.0222 208/500 [===========>..................] - ETA: 2:27 - loss: 0.3493 - regression_loss: 0.3271 - classification_loss: 0.0222 209/500 [===========>..................] - ETA: 2:27 - loss: 0.3492 - regression_loss: 0.3270 - classification_loss: 0.0222 210/500 [===========>..................] - ETA: 2:26 - loss: 0.3482 - regression_loss: 0.3261 - classification_loss: 0.0221 211/500 [===========>..................] - ETA: 2:26 - loss: 0.3482 - regression_loss: 0.3261 - classification_loss: 0.0221 212/500 [===========>..................] - ETA: 2:25 - loss: 0.3472 - regression_loss: 0.3252 - classification_loss: 0.0220 213/500 [===========>..................] - ETA: 2:25 - loss: 0.3465 - regression_loss: 0.3246 - classification_loss: 0.0219 214/500 [===========>..................] - ETA: 2:24 - loss: 0.3468 - regression_loss: 0.3248 - classification_loss: 0.0220 215/500 [===========>..................] - ETA: 2:24 - loss: 0.3473 - regression_loss: 0.3252 - classification_loss: 0.0221 216/500 [===========>..................] - ETA: 2:23 - loss: 0.3469 - regression_loss: 0.3248 - classification_loss: 0.0221 217/500 [============>.................] 
- ETA: 2:23 - loss: 0.3461 - regression_loss: 0.3241 - classification_loss: 0.0220 218/500 [============>.................] - ETA: 2:22 - loss: 0.3461 - regression_loss: 0.3241 - classification_loss: 0.0220 219/500 [============>.................] - ETA: 2:22 - loss: 0.3476 - regression_loss: 0.3256 - classification_loss: 0.0221 220/500 [============>.................] - ETA: 2:21 - loss: 0.3471 - regression_loss: 0.3250 - classification_loss: 0.0220 221/500 [============>.................] - ETA: 2:21 - loss: 0.3467 - regression_loss: 0.3247 - classification_loss: 0.0220 222/500 [============>.................] - ETA: 2:20 - loss: 0.3462 - regression_loss: 0.3243 - classification_loss: 0.0219 223/500 [============>.................] - ETA: 2:20 - loss: 0.3468 - regression_loss: 0.3249 - classification_loss: 0.0219 224/500 [============>.................] - ETA: 2:19 - loss: 0.3466 - regression_loss: 0.3246 - classification_loss: 0.0219 225/500 [============>.................] - ETA: 2:19 - loss: 0.3464 - regression_loss: 0.3245 - classification_loss: 0.0219 226/500 [============>.................] - ETA: 2:18 - loss: 0.3462 - regression_loss: 0.3243 - classification_loss: 0.0219 227/500 [============>.................] - ETA: 2:18 - loss: 0.3460 - regression_loss: 0.3242 - classification_loss: 0.0218 228/500 [============>.................] - ETA: 2:17 - loss: 0.3463 - regression_loss: 0.3245 - classification_loss: 0.0219 229/500 [============>.................] - ETA: 2:17 - loss: 0.3460 - regression_loss: 0.3242 - classification_loss: 0.0218 230/500 [============>.................] - ETA: 2:16 - loss: 0.3454 - regression_loss: 0.3236 - classification_loss: 0.0218 231/500 [============>.................] - ETA: 2:15 - loss: 0.3449 - regression_loss: 0.3232 - classification_loss: 0.0218 232/500 [============>.................] - ETA: 2:15 - loss: 0.3450 - regression_loss: 0.3233 - classification_loss: 0.0218 233/500 [============>.................] 
- ETA: 2:15 - loss: 0.3448 - regression_loss: 0.3230 - classification_loss: 0.0217 234/500 [=============>................] - ETA: 2:14 - loss: 0.3442 - regression_loss: 0.3226 - classification_loss: 0.0217 235/500 [=============>................] - ETA: 2:13 - loss: 0.3446 - regression_loss: 0.3228 - classification_loss: 0.0217 236/500 [=============>................] - ETA: 2:13 - loss: 0.3439 - regression_loss: 0.3222 - classification_loss: 0.0217 237/500 [=============>................] - ETA: 2:12 - loss: 0.3436 - regression_loss: 0.3219 - classification_loss: 0.0217 238/500 [=============>................] - ETA: 2:12 - loss: 0.3439 - regression_loss: 0.3221 - classification_loss: 0.0217 239/500 [=============>................] - ETA: 2:11 - loss: 0.3444 - regression_loss: 0.3226 - classification_loss: 0.0218 240/500 [=============>................] - ETA: 2:11 - loss: 0.3438 - regression_loss: 0.3221 - classification_loss: 0.0218 241/500 [=============>................] - ETA: 2:10 - loss: 0.3431 - regression_loss: 0.3214 - classification_loss: 0.0217 242/500 [=============>................] - ETA: 2:10 - loss: 0.3428 - regression_loss: 0.3211 - classification_loss: 0.0217 243/500 [=============>................] - ETA: 2:09 - loss: 0.3436 - regression_loss: 0.3218 - classification_loss: 0.0218 244/500 [=============>................] - ETA: 2:09 - loss: 0.3437 - regression_loss: 0.3218 - classification_loss: 0.0218 245/500 [=============>................] - ETA: 2:08 - loss: 0.3436 - regression_loss: 0.3219 - classification_loss: 0.0218 246/500 [=============>................] - ETA: 2:08 - loss: 0.3434 - regression_loss: 0.3216 - classification_loss: 0.0217 247/500 [=============>................] - ETA: 2:07 - loss: 0.3427 - regression_loss: 0.3211 - classification_loss: 0.0217 248/500 [=============>................] - ETA: 2:07 - loss: 0.3433 - regression_loss: 0.3214 - classification_loss: 0.0218 249/500 [=============>................] 
- ETA: 2:06 - loss: 0.3425 - regression_loss: 0.3208 - classification_loss: 0.0217 250/500 [==============>...............] - ETA: 2:06 - loss: 0.3427 - regression_loss: 0.3208 - classification_loss: 0.0219 251/500 [==============>...............] - ETA: 2:05 - loss: 0.3419 - regression_loss: 0.3201 - classification_loss: 0.0218 252/500 [==============>...............] - ETA: 2:05 - loss: 0.3419 - regression_loss: 0.3201 - classification_loss: 0.0218 253/500 [==============>...............] - ETA: 2:04 - loss: 0.3414 - regression_loss: 0.3196 - classification_loss: 0.0217 254/500 [==============>...............] - ETA: 2:04 - loss: 0.3415 - regression_loss: 0.3197 - classification_loss: 0.0218 255/500 [==============>...............] - ETA: 2:03 - loss: 0.3414 - regression_loss: 0.3197 - classification_loss: 0.0218 256/500 [==============>...............] - ETA: 2:03 - loss: 0.3418 - regression_loss: 0.3200 - classification_loss: 0.0218 257/500 [==============>...............] - ETA: 2:02 - loss: 0.3418 - regression_loss: 0.3201 - classification_loss: 0.0217 258/500 [==============>...............] - ETA: 2:02 - loss: 0.3418 - regression_loss: 0.3201 - classification_loss: 0.0217 259/500 [==============>...............] - ETA: 2:01 - loss: 0.3415 - regression_loss: 0.3198 - classification_loss: 0.0217 260/500 [==============>...............] - ETA: 2:01 - loss: 0.3414 - regression_loss: 0.3197 - classification_loss: 0.0217 261/500 [==============>...............] - ETA: 2:00 - loss: 0.3410 - regression_loss: 0.3193 - classification_loss: 0.0217 262/500 [==============>...............] - ETA: 2:00 - loss: 0.3406 - regression_loss: 0.3189 - classification_loss: 0.0216 263/500 [==============>...............] - ETA: 1:59 - loss: 0.3407 - regression_loss: 0.3190 - classification_loss: 0.0216 264/500 [==============>...............] - ETA: 1:59 - loss: 0.3401 - regression_loss: 0.3186 - classification_loss: 0.0216 265/500 [==============>...............] 
- ETA: 1:58 - loss: 0.3401 - regression_loss: 0.3186 - classification_loss: 0.0215 266/500 [==============>...............] - ETA: 1:58 - loss: 0.3399 - regression_loss: 0.3184 - classification_loss: 0.0215 267/500 [===============>..............] - ETA: 1:57 - loss: 0.3402 - regression_loss: 0.3187 - classification_loss: 0.0215 268/500 [===============>..............] - ETA: 1:57 - loss: 0.3392 - regression_loss: 0.3178 - classification_loss: 0.0214 269/500 [===============>..............] - ETA: 1:56 - loss: 0.3406 - regression_loss: 0.3192 - classification_loss: 0.0215 270/500 [===============>..............] - ETA: 1:56 - loss: 0.3417 - regression_loss: 0.3202 - classification_loss: 0.0215 271/500 [===============>..............] - ETA: 1:55 - loss: 0.3412 - regression_loss: 0.3198 - classification_loss: 0.0214 272/500 [===============>..............] - ETA: 1:55 - loss: 0.3409 - regression_loss: 0.3195 - classification_loss: 0.0214 273/500 [===============>..............] - ETA: 1:54 - loss: 0.3401 - regression_loss: 0.3188 - classification_loss: 0.0213 274/500 [===============>..............] - ETA: 1:54 - loss: 0.3406 - regression_loss: 0.3192 - classification_loss: 0.0214 275/500 [===============>..............] - ETA: 1:53 - loss: 0.3408 - regression_loss: 0.3194 - classification_loss: 0.0214 276/500 [===============>..............] - ETA: 1:53 - loss: 0.3408 - regression_loss: 0.3194 - classification_loss: 0.0214 277/500 [===============>..............] - ETA: 1:52 - loss: 0.3403 - regression_loss: 0.3190 - classification_loss: 0.0214 278/500 [===============>..............] - ETA: 1:52 - loss: 0.3401 - regression_loss: 0.3187 - classification_loss: 0.0213 279/500 [===============>..............] - ETA: 1:51 - loss: 0.3394 - regression_loss: 0.3181 - classification_loss: 0.0213 280/500 [===============>..............] - ETA: 1:51 - loss: 0.3388 - regression_loss: 0.3175 - classification_loss: 0.0213 281/500 [===============>..............] 
- ETA: 1:50 - loss: 0.3390 - regression_loss: 0.3178 - classification_loss: 0.0212 282/500 [===============>..............] - ETA: 1:50 - loss: 0.3395 - regression_loss: 0.3180 - classification_loss: 0.0215 283/500 [===============>..............] - ETA: 1:49 - loss: 0.3392 - regression_loss: 0.3177 - classification_loss: 0.0215 284/500 [================>.............] - ETA: 1:49 - loss: 0.3393 - regression_loss: 0.3179 - classification_loss: 0.0214 285/500 [================>.............] - ETA: 1:48 - loss: 0.3402 - regression_loss: 0.3186 - classification_loss: 0.0216 286/500 [================>.............] - ETA: 1:48 - loss: 0.3399 - regression_loss: 0.3184 - classification_loss: 0.0215 287/500 [================>.............] - ETA: 1:47 - loss: 0.3394 - regression_loss: 0.3179 - classification_loss: 0.0215 288/500 [================>.............] - ETA: 1:46 - loss: 0.3390 - regression_loss: 0.3176 - classification_loss: 0.0214 289/500 [================>.............] - ETA: 1:46 - loss: 0.3391 - regression_loss: 0.3177 - classification_loss: 0.0214 290/500 [================>.............] - ETA: 1:46 - loss: 0.3396 - regression_loss: 0.3181 - classification_loss: 0.0215 291/500 [================>.............] - ETA: 1:45 - loss: 0.3397 - regression_loss: 0.3181 - classification_loss: 0.0216 292/500 [================>.............] - ETA: 1:44 - loss: 0.3399 - regression_loss: 0.3183 - classification_loss: 0.0216 293/500 [================>.............] - ETA: 1:44 - loss: 0.3392 - regression_loss: 0.3177 - classification_loss: 0.0215 294/500 [================>.............] - ETA: 1:43 - loss: 0.3390 - regression_loss: 0.3175 - classification_loss: 0.0215 295/500 [================>.............] - ETA: 1:43 - loss: 0.3387 - regression_loss: 0.3173 - classification_loss: 0.0215 296/500 [================>.............] - ETA: 1:42 - loss: 0.3384 - regression_loss: 0.3170 - classification_loss: 0.0215 297/500 [================>.............] 
- ETA: 1:42 - loss: 0.3390 - regression_loss: 0.3175 - classification_loss: 0.0216 298/500 [================>.............] - ETA: 1:41 - loss: 0.3391 - regression_loss: 0.3175 - classification_loss: 0.0216 299/500 [================>.............] - ETA: 1:41 - loss: 0.3387 - regression_loss: 0.3170 - classification_loss: 0.0216 300/500 [=================>............] - ETA: 1:40 - loss: 0.3400 - regression_loss: 0.3183 - classification_loss: 0.0217 301/500 [=================>............] - ETA: 1:40 - loss: 0.3400 - regression_loss: 0.3183 - classification_loss: 0.0217 302/500 [=================>............] - ETA: 1:39 - loss: 0.3423 - regression_loss: 0.3204 - classification_loss: 0.0218 303/500 [=================>............] - ETA: 1:39 - loss: 0.3423 - regression_loss: 0.3205 - classification_loss: 0.0218 304/500 [=================>............] - ETA: 1:38 - loss: 0.3420 - regression_loss: 0.3202 - classification_loss: 0.0218 305/500 [=================>............] - ETA: 1:38 - loss: 0.3424 - regression_loss: 0.3206 - classification_loss: 0.0218 306/500 [=================>............] - ETA: 1:37 - loss: 0.3423 - regression_loss: 0.3205 - classification_loss: 0.0218 307/500 [=================>............] - ETA: 1:37 - loss: 0.3421 - regression_loss: 0.3203 - classification_loss: 0.0218 308/500 [=================>............] - ETA: 1:36 - loss: 0.3420 - regression_loss: 0.3202 - classification_loss: 0.0218 309/500 [=================>............] - ETA: 1:36 - loss: 0.3422 - regression_loss: 0.3205 - classification_loss: 0.0217 310/500 [=================>............] - ETA: 1:35 - loss: 0.3432 - regression_loss: 0.3214 - classification_loss: 0.0218 311/500 [=================>............] - ETA: 1:35 - loss: 0.3440 - regression_loss: 0.3223 - classification_loss: 0.0218 312/500 [=================>............] - ETA: 1:34 - loss: 0.3440 - regression_loss: 0.3223 - classification_loss: 0.0218 313/500 [=================>............] 
- ETA: 1:34 - loss: 0.3439 - regression_loss: 0.3221 - classification_loss: 0.0218 314/500 [=================>............] - ETA: 1:33 - loss: 0.3439 - regression_loss: 0.3221 - classification_loss: 0.0217 315/500 [=================>............] - ETA: 1:33 - loss: 0.3440 - regression_loss: 0.3222 - classification_loss: 0.0218 316/500 [=================>............] - ETA: 1:32 - loss: 0.3443 - regression_loss: 0.3224 - classification_loss: 0.0218 317/500 [==================>...........] - ETA: 1:32 - loss: 0.3443 - regression_loss: 0.3225 - classification_loss: 0.0218 318/500 [==================>...........] - ETA: 1:31 - loss: 0.3436 - regression_loss: 0.3218 - classification_loss: 0.0218 319/500 [==================>...........] - ETA: 1:31 - loss: 0.3432 - regression_loss: 0.3214 - classification_loss: 0.0217 320/500 [==================>...........] - ETA: 1:30 - loss: 0.3435 - regression_loss: 0.3218 - classification_loss: 0.0217 321/500 [==================>...........] - ETA: 1:30 - loss: 0.3438 - regression_loss: 0.3221 - classification_loss: 0.0217 322/500 [==================>...........] - ETA: 1:29 - loss: 0.3437 - regression_loss: 0.3219 - classification_loss: 0.0218 323/500 [==================>...........] - ETA: 1:29 - loss: 0.3438 - regression_loss: 0.3219 - classification_loss: 0.0218 324/500 [==================>...........] - ETA: 1:28 - loss: 0.3438 - regression_loss: 0.3220 - classification_loss: 0.0218 325/500 [==================>...........] - ETA: 1:28 - loss: 0.3432 - regression_loss: 0.3214 - classification_loss: 0.0217 326/500 [==================>...........] - ETA: 1:27 - loss: 0.3428 - regression_loss: 0.3210 - classification_loss: 0.0217 327/500 [==================>...........] - ETA: 1:27 - loss: 0.3425 - regression_loss: 0.3209 - classification_loss: 0.0217 328/500 [==================>...........] - ETA: 1:26 - loss: 0.3423 - regression_loss: 0.3207 - classification_loss: 0.0216 329/500 [==================>...........] 
- ETA: 1:26 - loss: 0.3421 - regression_loss: 0.3205 - classification_loss: 0.0217 330/500 [==================>...........] - ETA: 1:25 - loss: 0.3420 - regression_loss: 0.3203 - classification_loss: 0.0217 331/500 [==================>...........] - ETA: 1:25 - loss: 0.3417 - regression_loss: 0.3200 - classification_loss: 0.0217 332/500 [==================>...........] - ETA: 1:24 - loss: 0.3415 - regression_loss: 0.3198 - classification_loss: 0.0216 333/500 [==================>...........] - ETA: 1:24 - loss: 0.3408 - regression_loss: 0.3192 - classification_loss: 0.0216 334/500 [===================>..........] - ETA: 1:23 - loss: 0.3404 - regression_loss: 0.3188 - classification_loss: 0.0216 335/500 [===================>..........] - ETA: 1:23 - loss: 0.3399 - regression_loss: 0.3183 - classification_loss: 0.0215 336/500 [===================>..........] - ETA: 1:22 - loss: 0.3394 - regression_loss: 0.3179 - classification_loss: 0.0215 337/500 [===================>..........] - ETA: 1:22 - loss: 0.3387 - regression_loss: 0.3172 - classification_loss: 0.0214 338/500 [===================>..........] - ETA: 1:21 - loss: 0.3386 - regression_loss: 0.3172 - classification_loss: 0.0214 339/500 [===================>..........] - ETA: 1:21 - loss: 0.3386 - regression_loss: 0.3171 - classification_loss: 0.0214 340/500 [===================>..........] - ETA: 1:20 - loss: 0.3387 - regression_loss: 0.3173 - classification_loss: 0.0214 341/500 [===================>..........] - ETA: 1:20 - loss: 0.3388 - regression_loss: 0.3174 - classification_loss: 0.0214 342/500 [===================>..........] - ETA: 1:19 - loss: 0.3389 - regression_loss: 0.3176 - classification_loss: 0.0214 343/500 [===================>..........] - ETA: 1:19 - loss: 0.3384 - regression_loss: 0.3171 - classification_loss: 0.0213 344/500 [===================>..........] - ETA: 1:18 - loss: 0.3384 - regression_loss: 0.3171 - classification_loss: 0.0213 345/500 [===================>..........] 
- ETA: 1:18 - loss: 0.3380 - regression_loss: 0.3168 - classification_loss: 0.0212 346/500 [===================>..........] - ETA: 1:17 - loss: 0.3372 - regression_loss: 0.3160 - classification_loss: 0.0212 347/500 [===================>..........] - ETA: 1:17 - loss: 0.3376 - regression_loss: 0.3164 - classification_loss: 0.0212 348/500 [===================>..........] - ETA: 1:16 - loss: 0.3370 - regression_loss: 0.3159 - classification_loss: 0.0212 349/500 [===================>..........] - ETA: 1:16 - loss: 0.3371 - regression_loss: 0.3159 - classification_loss: 0.0212 350/500 [====================>.........] - ETA: 1:15 - loss: 0.3370 - regression_loss: 0.3159 - classification_loss: 0.0211 351/500 [====================>.........] - ETA: 1:15 - loss: 0.3367 - regression_loss: 0.3156 - classification_loss: 0.0211 352/500 [====================>.........] - ETA: 1:14 - loss: 0.3363 - regression_loss: 0.3152 - classification_loss: 0.0211 353/500 [====================>.........] - ETA: 1:14 - loss: 0.3364 - regression_loss: 0.3153 - classification_loss: 0.0211 354/500 [====================>.........] - ETA: 1:13 - loss: 0.3364 - regression_loss: 0.3153 - classification_loss: 0.0211 355/500 [====================>.........] - ETA: 1:13 - loss: 0.3367 - regression_loss: 0.3156 - classification_loss: 0.0212 356/500 [====================>.........] - ETA: 1:12 - loss: 0.3366 - regression_loss: 0.3155 - classification_loss: 0.0211 357/500 [====================>.........] - ETA: 1:12 - loss: 0.3364 - regression_loss: 0.3152 - classification_loss: 0.0211 358/500 [====================>.........] - ETA: 1:11 - loss: 0.3361 - regression_loss: 0.3150 - classification_loss: 0.0211 359/500 [====================>.........] - ETA: 1:11 - loss: 0.3358 - regression_loss: 0.3147 - classification_loss: 0.0211 360/500 [====================>.........] - ETA: 1:10 - loss: 0.3356 - regression_loss: 0.3145 - classification_loss: 0.0211 361/500 [====================>.........] 
- ETA: 1:10 - loss: 0.3353 - regression_loss: 0.3143 - classification_loss: 0.0210
[intermediate progress updates for steps 362-499 of epoch 17 elided; loss held steady around 0.334-0.340]
500/500 [==============================] - 252s 504ms/step - loss: 0.3388 - regression_loss: 0.3176 - classification_loss: 0.0212
1172 instances of class plum with average precision: 0.7283
mAP: 0.7283
Epoch 00017: saving model to ./training/snapshots/resnet101_pascal_17.h5
Epoch 18/150
1/500 [..............................] - ETA: 4:31 - loss: 0.1911 - regression_loss: 0.1834 - classification_loss: 0.0078
2/500 [..............................] - ETA: 4:19 - loss: 0.2614 - regression_loss: 0.2379 - classification_loss: 0.0236
3/500 [..............................] - ETA: 4:12 - loss: 0.2484 - regression_loss: 0.2282 - classification_loss: 0.0202
4/500 [..............................]
- ETA: 4:11 - loss: 0.2868 - regression_loss: 0.2657 - classification_loss: 0.0211
[intermediate progress updates for steps 5-195 of epoch 18 elided; loss drifted between roughly 0.26 and 0.29]
196/500 [==========>...................]
- ETA: 2:32 - loss: 0.2850 - regression_loss: 0.2689 - classification_loss: 0.0161 197/500 [==========>...................] - ETA: 2:32 - loss: 0.2845 - regression_loss: 0.2685 - classification_loss: 0.0160 198/500 [==========>...................] - ETA: 2:31 - loss: 0.2842 - regression_loss: 0.2682 - classification_loss: 0.0160 199/500 [==========>...................] - ETA: 2:31 - loss: 0.2836 - regression_loss: 0.2677 - classification_loss: 0.0159 200/500 [===========>..................] - ETA: 2:30 - loss: 0.2835 - regression_loss: 0.2676 - classification_loss: 0.0160 201/500 [===========>..................] - ETA: 2:30 - loss: 0.2831 - regression_loss: 0.2672 - classification_loss: 0.0159 202/500 [===========>..................] - ETA: 2:29 - loss: 0.2823 - regression_loss: 0.2664 - classification_loss: 0.0159 203/500 [===========>..................] - ETA: 2:29 - loss: 0.2823 - regression_loss: 0.2664 - classification_loss: 0.0159 204/500 [===========>..................] - ETA: 2:28 - loss: 0.2832 - regression_loss: 0.2672 - classification_loss: 0.0159 205/500 [===========>..................] - ETA: 2:28 - loss: 0.2833 - regression_loss: 0.2674 - classification_loss: 0.0159 206/500 [===========>..................] - ETA: 2:27 - loss: 0.2828 - regression_loss: 0.2669 - classification_loss: 0.0159 207/500 [===========>..................] - ETA: 2:27 - loss: 0.2832 - regression_loss: 0.2673 - classification_loss: 0.0159 208/500 [===========>..................] - ETA: 2:26 - loss: 0.2824 - regression_loss: 0.2665 - classification_loss: 0.0158 209/500 [===========>..................] - ETA: 2:26 - loss: 0.2827 - regression_loss: 0.2669 - classification_loss: 0.0158 210/500 [===========>..................] - ETA: 2:25 - loss: 0.2825 - regression_loss: 0.2667 - classification_loss: 0.0158 211/500 [===========>..................] - ETA: 2:25 - loss: 0.2825 - regression_loss: 0.2666 - classification_loss: 0.0159 212/500 [===========>..................] 
- ETA: 2:24 - loss: 0.2817 - regression_loss: 0.2659 - classification_loss: 0.0158 213/500 [===========>..................] - ETA: 2:24 - loss: 0.2827 - regression_loss: 0.2668 - classification_loss: 0.0159 214/500 [===========>..................] - ETA: 2:23 - loss: 0.2824 - regression_loss: 0.2665 - classification_loss: 0.0159 215/500 [===========>..................] - ETA: 2:23 - loss: 0.2817 - regression_loss: 0.2658 - classification_loss: 0.0159 216/500 [===========>..................] - ETA: 2:22 - loss: 0.2818 - regression_loss: 0.2660 - classification_loss: 0.0159 217/500 [============>.................] - ETA: 2:22 - loss: 0.2822 - regression_loss: 0.2663 - classification_loss: 0.0159 218/500 [============>.................] - ETA: 2:21 - loss: 0.2820 - regression_loss: 0.2661 - classification_loss: 0.0159 219/500 [============>.................] - ETA: 2:21 - loss: 0.2815 - regression_loss: 0.2656 - classification_loss: 0.0159 220/500 [============>.................] - ETA: 2:20 - loss: 0.2825 - regression_loss: 0.2666 - classification_loss: 0.0159 221/500 [============>.................] - ETA: 2:20 - loss: 0.2822 - regression_loss: 0.2663 - classification_loss: 0.0159 222/500 [============>.................] - ETA: 2:19 - loss: 0.2827 - regression_loss: 0.2667 - classification_loss: 0.0160 223/500 [============>.................] - ETA: 2:19 - loss: 0.2822 - regression_loss: 0.2662 - classification_loss: 0.0160 224/500 [============>.................] - ETA: 2:18 - loss: 0.2816 - regression_loss: 0.2657 - classification_loss: 0.0159 225/500 [============>.................] - ETA: 2:18 - loss: 0.2817 - regression_loss: 0.2658 - classification_loss: 0.0159 226/500 [============>.................] - ETA: 2:17 - loss: 0.2812 - regression_loss: 0.2653 - classification_loss: 0.0159 227/500 [============>.................] - ETA: 2:17 - loss: 0.2812 - regression_loss: 0.2653 - classification_loss: 0.0159 228/500 [============>.................] 
- ETA: 2:16 - loss: 0.2807 - regression_loss: 0.2648 - classification_loss: 0.0159 229/500 [============>.................] - ETA: 2:16 - loss: 0.2803 - regression_loss: 0.2645 - classification_loss: 0.0159 230/500 [============>.................] - ETA: 2:15 - loss: 0.2802 - regression_loss: 0.2643 - classification_loss: 0.0159 231/500 [============>.................] - ETA: 2:15 - loss: 0.2803 - regression_loss: 0.2644 - classification_loss: 0.0159 232/500 [============>.................] - ETA: 2:14 - loss: 0.2808 - regression_loss: 0.2649 - classification_loss: 0.0159 233/500 [============>.................] - ETA: 2:14 - loss: 0.2805 - regression_loss: 0.2646 - classification_loss: 0.0159 234/500 [=============>................] - ETA: 2:13 - loss: 0.2806 - regression_loss: 0.2647 - classification_loss: 0.0159 235/500 [=============>................] - ETA: 2:13 - loss: 0.2805 - regression_loss: 0.2646 - classification_loss: 0.0159 236/500 [=============>................] - ETA: 2:12 - loss: 0.2800 - regression_loss: 0.2642 - classification_loss: 0.0159 237/500 [=============>................] - ETA: 2:12 - loss: 0.2805 - regression_loss: 0.2646 - classification_loss: 0.0158 238/500 [=============>................] - ETA: 2:11 - loss: 0.2802 - regression_loss: 0.2644 - classification_loss: 0.0158 239/500 [=============>................] - ETA: 2:11 - loss: 0.2809 - regression_loss: 0.2651 - classification_loss: 0.0158 240/500 [=============>................] - ETA: 2:10 - loss: 0.2800 - regression_loss: 0.2643 - classification_loss: 0.0158 241/500 [=============>................] - ETA: 2:10 - loss: 0.2805 - regression_loss: 0.2647 - classification_loss: 0.0158 242/500 [=============>................] - ETA: 2:09 - loss: 0.2806 - regression_loss: 0.2648 - classification_loss: 0.0159 243/500 [=============>................] - ETA: 2:09 - loss: 0.2806 - regression_loss: 0.2647 - classification_loss: 0.0159 244/500 [=============>................] 
- ETA: 2:08 - loss: 0.2801 - regression_loss: 0.2643 - classification_loss: 0.0158 245/500 [=============>................] - ETA: 2:08 - loss: 0.2806 - regression_loss: 0.2648 - classification_loss: 0.0158 246/500 [=============>................] - ETA: 2:07 - loss: 0.2808 - regression_loss: 0.2649 - classification_loss: 0.0159 247/500 [=============>................] - ETA: 2:07 - loss: 0.2805 - regression_loss: 0.2646 - classification_loss: 0.0159 248/500 [=============>................] - ETA: 2:06 - loss: 0.2803 - regression_loss: 0.2644 - classification_loss: 0.0158 249/500 [=============>................] - ETA: 2:06 - loss: 0.2797 - regression_loss: 0.2639 - classification_loss: 0.0158 250/500 [==============>...............] - ETA: 2:05 - loss: 0.2802 - regression_loss: 0.2644 - classification_loss: 0.0158 251/500 [==============>...............] - ETA: 2:05 - loss: 0.2803 - regression_loss: 0.2646 - classification_loss: 0.0158 252/500 [==============>...............] - ETA: 2:04 - loss: 0.2799 - regression_loss: 0.2642 - classification_loss: 0.0157 253/500 [==============>...............] - ETA: 2:04 - loss: 0.2802 - regression_loss: 0.2645 - classification_loss: 0.0157 254/500 [==============>...............] - ETA: 2:03 - loss: 0.2803 - regression_loss: 0.2646 - classification_loss: 0.0157 255/500 [==============>...............] - ETA: 2:03 - loss: 0.2812 - regression_loss: 0.2653 - classification_loss: 0.0158 256/500 [==============>...............] - ETA: 2:02 - loss: 0.2810 - regression_loss: 0.2652 - classification_loss: 0.0158 257/500 [==============>...............] - ETA: 2:02 - loss: 0.2813 - regression_loss: 0.2655 - classification_loss: 0.0159 258/500 [==============>...............] - ETA: 2:01 - loss: 0.2819 - regression_loss: 0.2660 - classification_loss: 0.0159 259/500 [==============>...............] - ETA: 2:01 - loss: 0.2813 - regression_loss: 0.2655 - classification_loss: 0.0158 260/500 [==============>...............] 
- ETA: 2:00 - loss: 0.2821 - regression_loss: 0.2661 - classification_loss: 0.0160 261/500 [==============>...............] - ETA: 2:00 - loss: 0.2820 - regression_loss: 0.2660 - classification_loss: 0.0160 262/500 [==============>...............] - ETA: 1:59 - loss: 0.2821 - regression_loss: 0.2661 - classification_loss: 0.0160 263/500 [==============>...............] - ETA: 1:59 - loss: 0.2815 - regression_loss: 0.2655 - classification_loss: 0.0160 264/500 [==============>...............] - ETA: 1:58 - loss: 0.2820 - regression_loss: 0.2660 - classification_loss: 0.0160 265/500 [==============>...............] - ETA: 1:58 - loss: 0.2836 - regression_loss: 0.2675 - classification_loss: 0.0161 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2832 - regression_loss: 0.2671 - classification_loss: 0.0161 267/500 [===============>..............] - ETA: 1:57 - loss: 0.2838 - regression_loss: 0.2677 - classification_loss: 0.0162 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2843 - regression_loss: 0.2681 - classification_loss: 0.0162 269/500 [===============>..............] - ETA: 1:56 - loss: 0.2839 - regression_loss: 0.2678 - classification_loss: 0.0161 270/500 [===============>..............] - ETA: 1:55 - loss: 0.2836 - regression_loss: 0.2675 - classification_loss: 0.0161 271/500 [===============>..............] - ETA: 1:55 - loss: 0.2836 - regression_loss: 0.2675 - classification_loss: 0.0161 272/500 [===============>..............] - ETA: 1:54 - loss: 0.2839 - regression_loss: 0.2678 - classification_loss: 0.0161 273/500 [===============>..............] - ETA: 1:54 - loss: 0.2839 - regression_loss: 0.2678 - classification_loss: 0.0161 274/500 [===============>..............] - ETA: 1:53 - loss: 0.2838 - regression_loss: 0.2677 - classification_loss: 0.0160 275/500 [===============>..............] - ETA: 1:53 - loss: 0.2853 - regression_loss: 0.2692 - classification_loss: 0.0160 276/500 [===============>..............] 
- ETA: 1:52 - loss: 0.2855 - regression_loss: 0.2693 - classification_loss: 0.0161 277/500 [===============>..............] - ETA: 1:52 - loss: 0.2854 - regression_loss: 0.2693 - classification_loss: 0.0161 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2850 - regression_loss: 0.2689 - classification_loss: 0.0161 279/500 [===============>..............] - ETA: 1:51 - loss: 0.2846 - regression_loss: 0.2685 - classification_loss: 0.0161 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2842 - regression_loss: 0.2682 - classification_loss: 0.0160 281/500 [===============>..............] - ETA: 1:50 - loss: 0.2848 - regression_loss: 0.2687 - classification_loss: 0.0160 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2852 - regression_loss: 0.2691 - classification_loss: 0.0161 283/500 [===============>..............] - ETA: 1:49 - loss: 0.2844 - regression_loss: 0.2684 - classification_loss: 0.0160 284/500 [================>.............] - ETA: 1:48 - loss: 0.2848 - regression_loss: 0.2687 - classification_loss: 0.0161 285/500 [================>.............] - ETA: 1:48 - loss: 0.2852 - regression_loss: 0.2691 - classification_loss: 0.0161 286/500 [================>.............] - ETA: 1:47 - loss: 0.2850 - regression_loss: 0.2690 - classification_loss: 0.0161 287/500 [================>.............] - ETA: 1:47 - loss: 0.2854 - regression_loss: 0.2693 - classification_loss: 0.0161 288/500 [================>.............] - ETA: 1:46 - loss: 0.2859 - regression_loss: 0.2698 - classification_loss: 0.0161 289/500 [================>.............] - ETA: 1:46 - loss: 0.2861 - regression_loss: 0.2700 - classification_loss: 0.0161 290/500 [================>.............] - ETA: 1:45 - loss: 0.2871 - regression_loss: 0.2708 - classification_loss: 0.0163 291/500 [================>.............] - ETA: 1:45 - loss: 0.2871 - regression_loss: 0.2709 - classification_loss: 0.0163 292/500 [================>.............] 
- ETA: 1:44 - loss: 0.2875 - regression_loss: 0.2713 - classification_loss: 0.0162 293/500 [================>.............] - ETA: 1:44 - loss: 0.2874 - regression_loss: 0.2712 - classification_loss: 0.0162 294/500 [================>.............] - ETA: 1:43 - loss: 0.2872 - regression_loss: 0.2710 - classification_loss: 0.0162 295/500 [================>.............] - ETA: 1:43 - loss: 0.2879 - regression_loss: 0.2716 - classification_loss: 0.0163 296/500 [================>.............] - ETA: 1:42 - loss: 0.2883 - regression_loss: 0.2720 - classification_loss: 0.0164 297/500 [================>.............] - ETA: 1:42 - loss: 0.2887 - regression_loss: 0.2723 - classification_loss: 0.0164 298/500 [================>.............] - ETA: 1:41 - loss: 0.2901 - regression_loss: 0.2736 - classification_loss: 0.0165 299/500 [================>.............] - ETA: 1:41 - loss: 0.2904 - regression_loss: 0.2739 - classification_loss: 0.0165 300/500 [=================>............] - ETA: 1:40 - loss: 0.2909 - regression_loss: 0.2743 - classification_loss: 0.0166 301/500 [=================>............] - ETA: 1:40 - loss: 0.2910 - regression_loss: 0.2745 - classification_loss: 0.0166 302/500 [=================>............] - ETA: 1:39 - loss: 0.2907 - regression_loss: 0.2742 - classification_loss: 0.0165 303/500 [=================>............] - ETA: 1:38 - loss: 0.2899 - regression_loss: 0.2734 - classification_loss: 0.0165 304/500 [=================>............] - ETA: 1:38 - loss: 0.2895 - regression_loss: 0.2730 - classification_loss: 0.0165 305/500 [=================>............] - ETA: 1:37 - loss: 0.2898 - regression_loss: 0.2734 - classification_loss: 0.0165 306/500 [=================>............] - ETA: 1:37 - loss: 0.2899 - regression_loss: 0.2734 - classification_loss: 0.0165 307/500 [=================>............] - ETA: 1:36 - loss: 0.2901 - regression_loss: 0.2736 - classification_loss: 0.0165 308/500 [=================>............] 
- ETA: 1:36 - loss: 0.2900 - regression_loss: 0.2734 - classification_loss: 0.0166 309/500 [=================>............] - ETA: 1:35 - loss: 0.2904 - regression_loss: 0.2737 - classification_loss: 0.0167 310/500 [=================>............] - ETA: 1:35 - loss: 0.2902 - regression_loss: 0.2735 - classification_loss: 0.0167 311/500 [=================>............] - ETA: 1:34 - loss: 0.2908 - regression_loss: 0.2741 - classification_loss: 0.0168 312/500 [=================>............] - ETA: 1:34 - loss: 0.2911 - regression_loss: 0.2743 - classification_loss: 0.0168 313/500 [=================>............] - ETA: 1:33 - loss: 0.2911 - regression_loss: 0.2743 - classification_loss: 0.0167 314/500 [=================>............] - ETA: 1:33 - loss: 0.2913 - regression_loss: 0.2745 - classification_loss: 0.0168 315/500 [=================>............] - ETA: 1:32 - loss: 0.2921 - regression_loss: 0.2753 - classification_loss: 0.0168 316/500 [=================>............] - ETA: 1:32 - loss: 0.2919 - regression_loss: 0.2752 - classification_loss: 0.0168 317/500 [==================>...........] - ETA: 1:31 - loss: 0.2918 - regression_loss: 0.2751 - classification_loss: 0.0167 318/500 [==================>...........] - ETA: 1:31 - loss: 0.2922 - regression_loss: 0.2754 - classification_loss: 0.0168 319/500 [==================>...........] - ETA: 1:30 - loss: 0.2921 - regression_loss: 0.2753 - classification_loss: 0.0168 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2930 - regression_loss: 0.2761 - classification_loss: 0.0169 321/500 [==================>...........] - ETA: 1:29 - loss: 0.2936 - regression_loss: 0.2766 - classification_loss: 0.0170 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2943 - regression_loss: 0.2773 - classification_loss: 0.0170 323/500 [==================>...........] - ETA: 1:28 - loss: 0.2943 - regression_loss: 0.2773 - classification_loss: 0.0170 324/500 [==================>...........] 
- ETA: 1:28 - loss: 0.2941 - regression_loss: 0.2771 - classification_loss: 0.0170 325/500 [==================>...........] - ETA: 1:27 - loss: 0.2941 - regression_loss: 0.2771 - classification_loss: 0.0170 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2938 - regression_loss: 0.2768 - classification_loss: 0.0170 327/500 [==================>...........] - ETA: 1:26 - loss: 0.2945 - regression_loss: 0.2774 - classification_loss: 0.0171 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2946 - regression_loss: 0.2775 - classification_loss: 0.0171 329/500 [==================>...........] - ETA: 1:25 - loss: 0.2946 - regression_loss: 0.2775 - classification_loss: 0.0170 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2944 - regression_loss: 0.2773 - classification_loss: 0.0170 331/500 [==================>...........] - ETA: 1:24 - loss: 0.2941 - regression_loss: 0.2771 - classification_loss: 0.0170 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2941 - regression_loss: 0.2771 - classification_loss: 0.0170 333/500 [==================>...........] - ETA: 1:23 - loss: 0.2941 - regression_loss: 0.2771 - classification_loss: 0.0170 334/500 [===================>..........] - ETA: 1:23 - loss: 0.2938 - regression_loss: 0.2768 - classification_loss: 0.0169 335/500 [===================>..........] - ETA: 1:22 - loss: 0.2940 - regression_loss: 0.2770 - classification_loss: 0.0170 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2940 - regression_loss: 0.2770 - classification_loss: 0.0171 337/500 [===================>..........] - ETA: 1:21 - loss: 0.2943 - regression_loss: 0.2773 - classification_loss: 0.0170 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2940 - regression_loss: 0.2770 - classification_loss: 0.0170 339/500 [===================>..........] - ETA: 1:20 - loss: 0.2941 - regression_loss: 0.2771 - classification_loss: 0.0170 340/500 [===================>..........] 
- ETA: 1:20 - loss: 0.2939 - regression_loss: 0.2769 - classification_loss: 0.0170 341/500 [===================>..........] - ETA: 1:19 - loss: 0.2941 - regression_loss: 0.2771 - classification_loss: 0.0170 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2942 - regression_loss: 0.2772 - classification_loss: 0.0170 343/500 [===================>..........] - ETA: 1:18 - loss: 0.2947 - regression_loss: 0.2777 - classification_loss: 0.0171 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2950 - regression_loss: 0.2779 - classification_loss: 0.0171 345/500 [===================>..........] - ETA: 1:17 - loss: 0.2956 - regression_loss: 0.2785 - classification_loss: 0.0170 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2953 - regression_loss: 0.2783 - classification_loss: 0.0170 347/500 [===================>..........] - ETA: 1:16 - loss: 0.2948 - regression_loss: 0.2779 - classification_loss: 0.0170 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2950 - regression_loss: 0.2780 - classification_loss: 0.0170 349/500 [===================>..........] - ETA: 1:15 - loss: 0.2947 - regression_loss: 0.2778 - classification_loss: 0.0169 350/500 [====================>.........] - ETA: 1:15 - loss: 0.2953 - regression_loss: 0.2783 - classification_loss: 0.0170 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2961 - regression_loss: 0.2791 - classification_loss: 0.0170 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2959 - regression_loss: 0.2789 - classification_loss: 0.0170 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2960 - regression_loss: 0.2790 - classification_loss: 0.0170 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2962 - regression_loss: 0.2792 - classification_loss: 0.0170 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2959 - regression_loss: 0.2789 - classification_loss: 0.0170 356/500 [====================>.........] 
- ETA: 1:12 - loss: 0.2963 - regression_loss: 0.2793 - classification_loss: 0.0170 357/500 [====================>.........] - ETA: 1:11 - loss: 0.2962 - regression_loss: 0.2793 - classification_loss: 0.0170 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2965 - regression_loss: 0.2794 - classification_loss: 0.0170 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2963 - regression_loss: 0.2792 - classification_loss: 0.0170 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2960 - regression_loss: 0.2790 - classification_loss: 0.0170 361/500 [====================>.........] - ETA: 1:09 - loss: 0.2960 - regression_loss: 0.2790 - classification_loss: 0.0170 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2960 - regression_loss: 0.2790 - classification_loss: 0.0170 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2964 - regression_loss: 0.2793 - classification_loss: 0.0171 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2968 - regression_loss: 0.2796 - classification_loss: 0.0172 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2969 - regression_loss: 0.2797 - classification_loss: 0.0172 366/500 [====================>.........] - ETA: 1:07 - loss: 0.2971 - regression_loss: 0.2799 - classification_loss: 0.0172 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2969 - regression_loss: 0.2797 - classification_loss: 0.0172 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2968 - regression_loss: 0.2796 - classification_loss: 0.0172 369/500 [=====================>........] - ETA: 1:05 - loss: 0.2966 - regression_loss: 0.2795 - classification_loss: 0.0172 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2963 - regression_loss: 0.2792 - classification_loss: 0.0171 371/500 [=====================>........] - ETA: 1:04 - loss: 0.2962 - regression_loss: 0.2790 - classification_loss: 0.0171 372/500 [=====================>........] 
- ETA: 1:04 - loss: 0.2958 - regression_loss: 0.2787 - classification_loss: 0.0171 373/500 [=====================>........] - ETA: 1:03 - loss: 0.2959 - regression_loss: 0.2788 - classification_loss: 0.0171 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2956 - regression_loss: 0.2785 - classification_loss: 0.0171 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2958 - regression_loss: 0.2787 - classification_loss: 0.0171 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2967 - regression_loss: 0.2795 - classification_loss: 0.0171 377/500 [=====================>........] - ETA: 1:01 - loss: 0.2967 - regression_loss: 0.2796 - classification_loss: 0.0171 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2963 - regression_loss: 0.2792 - classification_loss: 0.0171 379/500 [=====================>........] - ETA: 1:00 - loss: 0.2963 - regression_loss: 0.2792 - classification_loss: 0.0171 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2961 - regression_loss: 0.2791 - classification_loss: 0.0170 381/500 [=====================>........] - ETA: 59s - loss: 0.2958 - regression_loss: 0.2788 - classification_loss: 0.0170  382/500 [=====================>........] - ETA: 59s - loss: 0.2958 - regression_loss: 0.2787 - classification_loss: 0.0170 383/500 [=====================>........] - ETA: 58s - loss: 0.2957 - regression_loss: 0.2787 - classification_loss: 0.0171 384/500 [======================>.......] - ETA: 58s - loss: 0.2954 - regression_loss: 0.2783 - classification_loss: 0.0170 385/500 [======================>.......] - ETA: 57s - loss: 0.2960 - regression_loss: 0.2789 - classification_loss: 0.0171 386/500 [======================>.......] - ETA: 57s - loss: 0.2966 - regression_loss: 0.2795 - classification_loss: 0.0171 387/500 [======================>.......] - ETA: 56s - loss: 0.2966 - regression_loss: 0.2795 - classification_loss: 0.0171 388/500 [======================>.......] 
- ETA: 56s - loss: 0.2967 - regression_loss: 0.2796 - classification_loss: 0.0171 389/500 [======================>.......] - ETA: 55s - loss: 0.2965 - regression_loss: 0.2794 - classification_loss: 0.0171 390/500 [======================>.......] - ETA: 55s - loss: 0.2969 - regression_loss: 0.2798 - classification_loss: 0.0171 391/500 [======================>.......] - ETA: 54s - loss: 0.2968 - regression_loss: 0.2797 - classification_loss: 0.0171 392/500 [======================>.......] - ETA: 54s - loss: 0.2967 - regression_loss: 0.2796 - classification_loss: 0.0171 393/500 [======================>.......] - ETA: 53s - loss: 0.2964 - regression_loss: 0.2793 - classification_loss: 0.0171 394/500 [======================>.......] - ETA: 53s - loss: 0.2964 - regression_loss: 0.2793 - classification_loss: 0.0171 395/500 [======================>.......] - ETA: 52s - loss: 0.2961 - regression_loss: 0.2790 - classification_loss: 0.0171 396/500 [======================>.......] - ETA: 52s - loss: 0.2961 - regression_loss: 0.2790 - classification_loss: 0.0171 397/500 [======================>.......] - ETA: 51s - loss: 0.2958 - regression_loss: 0.2787 - classification_loss: 0.0171 398/500 [======================>.......] - ETA: 51s - loss: 0.2953 - regression_loss: 0.2783 - classification_loss: 0.0171 399/500 [======================>.......] - ETA: 50s - loss: 0.2957 - regression_loss: 0.2786 - classification_loss: 0.0171 400/500 [=======================>......] - ETA: 50s - loss: 0.2955 - regression_loss: 0.2784 - classification_loss: 0.0171 401/500 [=======================>......] - ETA: 49s - loss: 0.2957 - regression_loss: 0.2786 - classification_loss: 0.0171 402/500 [=======================>......] - ETA: 49s - loss: 0.2954 - regression_loss: 0.2783 - classification_loss: 0.0171 403/500 [=======================>......] - ETA: 48s - loss: 0.2953 - regression_loss: 0.2782 - classification_loss: 0.0171 404/500 [=======================>......] 
500/500 [==============================] - 251s 502ms/step - loss: 0.2973 - regression_loss: 0.2798 - classification_loss: 0.0175
1172 instances of class plum with average precision: 0.7413
mAP: 0.7413
Epoch 00018: saving model to ./training/snapshots/resnet101_pascal_18.h5
Epoch 19/150
- ETA: 2:11 - loss: 0.3028 - regression_loss: 0.2860 - classification_loss: 0.0168 239/500 [=============>................] - ETA: 2:11 - loss: 0.3032 - regression_loss: 0.2863 - classification_loss: 0.0168 240/500 [=============>................] - ETA: 2:10 - loss: 0.3041 - regression_loss: 0.2873 - classification_loss: 0.0168 241/500 [=============>................] - ETA: 2:10 - loss: 0.3043 - regression_loss: 0.2875 - classification_loss: 0.0168 242/500 [=============>................] - ETA: 2:09 - loss: 0.3042 - regression_loss: 0.2874 - classification_loss: 0.0168 243/500 [=============>................] - ETA: 2:09 - loss: 0.3044 - regression_loss: 0.2876 - classification_loss: 0.0169 244/500 [=============>................] - ETA: 2:08 - loss: 0.3060 - regression_loss: 0.2890 - classification_loss: 0.0171 245/500 [=============>................] - ETA: 2:08 - loss: 0.3058 - regression_loss: 0.2887 - classification_loss: 0.0171 246/500 [=============>................] - ETA: 2:07 - loss: 0.3060 - regression_loss: 0.2889 - classification_loss: 0.0171 247/500 [=============>................] - ETA: 2:07 - loss: 0.3060 - regression_loss: 0.2889 - classification_loss: 0.0171 248/500 [=============>................] - ETA: 2:06 - loss: 0.3057 - regression_loss: 0.2886 - classification_loss: 0.0171 249/500 [=============>................] - ETA: 2:06 - loss: 0.3050 - regression_loss: 0.2880 - classification_loss: 0.0170 250/500 [==============>...............] - ETA: 2:05 - loss: 0.3049 - regression_loss: 0.2878 - classification_loss: 0.0170 251/500 [==============>...............] - ETA: 2:05 - loss: 0.3048 - regression_loss: 0.2878 - classification_loss: 0.0170 252/500 [==============>...............] - ETA: 2:04 - loss: 0.3048 - regression_loss: 0.2878 - classification_loss: 0.0170 253/500 [==============>...............] - ETA: 2:04 - loss: 0.3042 - regression_loss: 0.2872 - classification_loss: 0.0170 254/500 [==============>...............] 
- ETA: 2:03 - loss: 0.3048 - regression_loss: 0.2878 - classification_loss: 0.0170 255/500 [==============>...............] - ETA: 2:03 - loss: 0.3045 - regression_loss: 0.2876 - classification_loss: 0.0169 256/500 [==============>...............] - ETA: 2:02 - loss: 0.3045 - regression_loss: 0.2876 - classification_loss: 0.0169 257/500 [==============>...............] - ETA: 2:02 - loss: 0.3049 - regression_loss: 0.2879 - classification_loss: 0.0169 258/500 [==============>...............] - ETA: 2:01 - loss: 0.3054 - regression_loss: 0.2883 - classification_loss: 0.0171 259/500 [==============>...............] - ETA: 2:01 - loss: 0.3056 - regression_loss: 0.2886 - classification_loss: 0.0171 260/500 [==============>...............] - ETA: 2:00 - loss: 0.3065 - regression_loss: 0.2894 - classification_loss: 0.0171 261/500 [==============>...............] - ETA: 2:00 - loss: 0.3067 - regression_loss: 0.2896 - classification_loss: 0.0171 262/500 [==============>...............] - ETA: 1:59 - loss: 0.3068 - regression_loss: 0.2897 - classification_loss: 0.0171 263/500 [==============>...............] - ETA: 1:59 - loss: 0.3066 - regression_loss: 0.2895 - classification_loss: 0.0171 264/500 [==============>...............] - ETA: 1:58 - loss: 0.3068 - regression_loss: 0.2896 - classification_loss: 0.0171 265/500 [==============>...............] - ETA: 1:58 - loss: 0.3067 - regression_loss: 0.2896 - classification_loss: 0.0171 266/500 [==============>...............] - ETA: 1:57 - loss: 0.3062 - regression_loss: 0.2891 - classification_loss: 0.0171 267/500 [===============>..............] - ETA: 1:57 - loss: 0.3060 - regression_loss: 0.2889 - classification_loss: 0.0171 268/500 [===============>..............] - ETA: 1:56 - loss: 0.3063 - regression_loss: 0.2892 - classification_loss: 0.0171 269/500 [===============>..............] - ETA: 1:56 - loss: 0.3060 - regression_loss: 0.2889 - classification_loss: 0.0171 270/500 [===============>..............] 
- ETA: 1:55 - loss: 0.3070 - regression_loss: 0.2899 - classification_loss: 0.0171 271/500 [===============>..............] - ETA: 1:55 - loss: 0.3067 - regression_loss: 0.2897 - classification_loss: 0.0171 272/500 [===============>..............] - ETA: 1:54 - loss: 0.3065 - regression_loss: 0.2895 - classification_loss: 0.0170 273/500 [===============>..............] - ETA: 1:54 - loss: 0.3060 - regression_loss: 0.2890 - classification_loss: 0.0170 274/500 [===============>..............] - ETA: 1:53 - loss: 0.3055 - regression_loss: 0.2885 - classification_loss: 0.0170 275/500 [===============>..............] - ETA: 1:53 - loss: 0.3057 - regression_loss: 0.2887 - classification_loss: 0.0170 276/500 [===============>..............] - ETA: 1:52 - loss: 0.3055 - regression_loss: 0.2885 - classification_loss: 0.0169 277/500 [===============>..............] - ETA: 1:52 - loss: 0.3049 - regression_loss: 0.2880 - classification_loss: 0.0169 278/500 [===============>..............] - ETA: 1:51 - loss: 0.3052 - regression_loss: 0.2883 - classification_loss: 0.0169 279/500 [===============>..............] - ETA: 1:51 - loss: 0.3049 - regression_loss: 0.2880 - classification_loss: 0.0169 280/500 [===============>..............] - ETA: 1:50 - loss: 0.3048 - regression_loss: 0.2879 - classification_loss: 0.0170 281/500 [===============>..............] - ETA: 1:50 - loss: 0.3056 - regression_loss: 0.2884 - classification_loss: 0.0172 282/500 [===============>..............] - ETA: 1:49 - loss: 0.3056 - regression_loss: 0.2884 - classification_loss: 0.0172 283/500 [===============>..............] - ETA: 1:49 - loss: 0.3060 - regression_loss: 0.2888 - classification_loss: 0.0172 284/500 [================>.............] - ETA: 1:48 - loss: 0.3055 - regression_loss: 0.2884 - classification_loss: 0.0172 285/500 [================>.............] - ETA: 1:48 - loss: 0.3052 - regression_loss: 0.2880 - classification_loss: 0.0171 286/500 [================>.............] 
- ETA: 1:47 - loss: 0.3051 - regression_loss: 0.2880 - classification_loss: 0.0171 287/500 [================>.............] - ETA: 1:47 - loss: 0.3047 - regression_loss: 0.2876 - classification_loss: 0.0171 288/500 [================>.............] - ETA: 1:46 - loss: 0.3047 - regression_loss: 0.2876 - classification_loss: 0.0171 289/500 [================>.............] - ETA: 1:46 - loss: 0.3050 - regression_loss: 0.2880 - classification_loss: 0.0171 290/500 [================>.............] - ETA: 1:45 - loss: 0.3054 - regression_loss: 0.2882 - classification_loss: 0.0171 291/500 [================>.............] - ETA: 1:45 - loss: 0.3056 - regression_loss: 0.2885 - classification_loss: 0.0171 292/500 [================>.............] - ETA: 1:44 - loss: 0.3055 - regression_loss: 0.2884 - classification_loss: 0.0171 293/500 [================>.............] - ETA: 1:44 - loss: 0.3061 - regression_loss: 0.2889 - classification_loss: 0.0172 294/500 [================>.............] - ETA: 1:43 - loss: 0.3068 - regression_loss: 0.2896 - classification_loss: 0.0172 295/500 [================>.............] - ETA: 1:43 - loss: 0.3069 - regression_loss: 0.2897 - classification_loss: 0.0172 296/500 [================>.............] - ETA: 1:42 - loss: 0.3069 - regression_loss: 0.2897 - classification_loss: 0.0172 297/500 [================>.............] - ETA: 1:42 - loss: 0.3068 - regression_loss: 0.2896 - classification_loss: 0.0172 298/500 [================>.............] - ETA: 1:41 - loss: 0.3069 - regression_loss: 0.2897 - classification_loss: 0.0172 299/500 [================>.............] - ETA: 1:41 - loss: 0.3066 - regression_loss: 0.2894 - classification_loss: 0.0172 300/500 [=================>............] - ETA: 1:40 - loss: 0.3069 - regression_loss: 0.2896 - classification_loss: 0.0173 301/500 [=================>............] - ETA: 1:40 - loss: 0.3068 - regression_loss: 0.2896 - classification_loss: 0.0173 302/500 [=================>............] 
- ETA: 1:39 - loss: 0.3066 - regression_loss: 0.2894 - classification_loss: 0.0172 303/500 [=================>............] - ETA: 1:39 - loss: 0.3061 - regression_loss: 0.2889 - classification_loss: 0.0172 304/500 [=================>............] - ETA: 1:38 - loss: 0.3058 - regression_loss: 0.2886 - classification_loss: 0.0172 305/500 [=================>............] - ETA: 1:38 - loss: 0.3053 - regression_loss: 0.2881 - classification_loss: 0.0171 306/500 [=================>............] - ETA: 1:37 - loss: 0.3055 - regression_loss: 0.2883 - classification_loss: 0.0172 307/500 [=================>............] - ETA: 1:37 - loss: 0.3053 - regression_loss: 0.2881 - classification_loss: 0.0171 308/500 [=================>............] - ETA: 1:36 - loss: 0.3055 - regression_loss: 0.2883 - classification_loss: 0.0172 309/500 [=================>............] - ETA: 1:36 - loss: 0.3054 - regression_loss: 0.2882 - classification_loss: 0.0172 310/500 [=================>............] - ETA: 1:35 - loss: 0.3050 - regression_loss: 0.2879 - classification_loss: 0.0172 311/500 [=================>............] - ETA: 1:35 - loss: 0.3045 - regression_loss: 0.2873 - classification_loss: 0.0171 312/500 [=================>............] - ETA: 1:34 - loss: 0.3050 - regression_loss: 0.2878 - classification_loss: 0.0172 313/500 [=================>............] - ETA: 1:34 - loss: 0.3051 - regression_loss: 0.2880 - classification_loss: 0.0172 314/500 [=================>............] - ETA: 1:33 - loss: 0.3053 - regression_loss: 0.2881 - classification_loss: 0.0172 315/500 [=================>............] - ETA: 1:33 - loss: 0.3053 - regression_loss: 0.2881 - classification_loss: 0.0172 316/500 [=================>............] - ETA: 1:32 - loss: 0.3048 - regression_loss: 0.2876 - classification_loss: 0.0171 317/500 [==================>...........] - ETA: 1:32 - loss: 0.3048 - regression_loss: 0.2877 - classification_loss: 0.0171 318/500 [==================>...........] 
- ETA: 1:31 - loss: 0.3046 - regression_loss: 0.2875 - classification_loss: 0.0171 319/500 [==================>...........] - ETA: 1:31 - loss: 0.3047 - regression_loss: 0.2876 - classification_loss: 0.0171 320/500 [==================>...........] - ETA: 1:30 - loss: 0.3050 - regression_loss: 0.2879 - classification_loss: 0.0171 321/500 [==================>...........] - ETA: 1:30 - loss: 0.3046 - regression_loss: 0.2875 - classification_loss: 0.0171 322/500 [==================>...........] - ETA: 1:29 - loss: 0.3042 - regression_loss: 0.2871 - classification_loss: 0.0171 323/500 [==================>...........] - ETA: 1:29 - loss: 0.3039 - regression_loss: 0.2869 - classification_loss: 0.0171 324/500 [==================>...........] - ETA: 1:28 - loss: 0.3045 - regression_loss: 0.2874 - classification_loss: 0.0171 325/500 [==================>...........] - ETA: 1:28 - loss: 0.3039 - regression_loss: 0.2869 - classification_loss: 0.0170 326/500 [==================>...........] - ETA: 1:27 - loss: 0.3036 - regression_loss: 0.2866 - classification_loss: 0.0170 327/500 [==================>...........] - ETA: 1:27 - loss: 0.3034 - regression_loss: 0.2864 - classification_loss: 0.0170 328/500 [==================>...........] - ETA: 1:26 - loss: 0.3031 - regression_loss: 0.2861 - classification_loss: 0.0170 329/500 [==================>...........] - ETA: 1:26 - loss: 0.3030 - regression_loss: 0.2860 - classification_loss: 0.0170 330/500 [==================>...........] - ETA: 1:25 - loss: 0.3023 - regression_loss: 0.2854 - classification_loss: 0.0169 331/500 [==================>...........] - ETA: 1:25 - loss: 0.3026 - regression_loss: 0.2857 - classification_loss: 0.0169 332/500 [==================>...........] - ETA: 1:24 - loss: 0.3030 - regression_loss: 0.2860 - classification_loss: 0.0170 333/500 [==================>...........] - ETA: 1:24 - loss: 0.3029 - regression_loss: 0.2859 - classification_loss: 0.0170 334/500 [===================>..........] 
- ETA: 1:23 - loss: 0.3027 - regression_loss: 0.2857 - classification_loss: 0.0170 335/500 [===================>..........] - ETA: 1:23 - loss: 0.3021 - regression_loss: 0.2851 - classification_loss: 0.0169 336/500 [===================>..........] - ETA: 1:22 - loss: 0.3019 - regression_loss: 0.2850 - classification_loss: 0.0169 337/500 [===================>..........] - ETA: 1:22 - loss: 0.3016 - regression_loss: 0.2847 - classification_loss: 0.0169 338/500 [===================>..........] - ETA: 1:21 - loss: 0.3013 - regression_loss: 0.2844 - classification_loss: 0.0169 339/500 [===================>..........] - ETA: 1:21 - loss: 0.3011 - regression_loss: 0.2843 - classification_loss: 0.0169 340/500 [===================>..........] - ETA: 1:20 - loss: 0.3015 - regression_loss: 0.2845 - classification_loss: 0.0170 341/500 [===================>..........] - ETA: 1:20 - loss: 0.3016 - regression_loss: 0.2845 - classification_loss: 0.0170 342/500 [===================>..........] - ETA: 1:19 - loss: 0.3012 - regression_loss: 0.2842 - classification_loss: 0.0170 343/500 [===================>..........] - ETA: 1:19 - loss: 0.3009 - regression_loss: 0.2839 - classification_loss: 0.0170 344/500 [===================>..........] - ETA: 1:18 - loss: 0.3008 - regression_loss: 0.2838 - classification_loss: 0.0170 345/500 [===================>..........] - ETA: 1:18 - loss: 0.3005 - regression_loss: 0.2835 - classification_loss: 0.0170 346/500 [===================>..........] - ETA: 1:17 - loss: 0.3001 - regression_loss: 0.2832 - classification_loss: 0.0170 347/500 [===================>..........] - ETA: 1:16 - loss: 0.3000 - regression_loss: 0.2831 - classification_loss: 0.0170 348/500 [===================>..........] - ETA: 1:16 - loss: 0.3002 - regression_loss: 0.2833 - classification_loss: 0.0169 349/500 [===================>..........] - ETA: 1:15 - loss: 0.3002 - regression_loss: 0.2832 - classification_loss: 0.0170 350/500 [====================>.........] 
- ETA: 1:15 - loss: 0.2998 - regression_loss: 0.2829 - classification_loss: 0.0169 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2996 - regression_loss: 0.2827 - classification_loss: 0.0169 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2989 - regression_loss: 0.2820 - classification_loss: 0.0169 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2993 - regression_loss: 0.2824 - classification_loss: 0.0169 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2998 - regression_loss: 0.2828 - classification_loss: 0.0170 355/500 [====================>.........] - ETA: 1:12 - loss: 0.3001 - regression_loss: 0.2831 - classification_loss: 0.0170 356/500 [====================>.........] - ETA: 1:12 - loss: 0.3002 - regression_loss: 0.2833 - classification_loss: 0.0170 357/500 [====================>.........] - ETA: 1:11 - loss: 0.3004 - regression_loss: 0.2834 - classification_loss: 0.0170 358/500 [====================>.........] - ETA: 1:11 - loss: 0.3001 - regression_loss: 0.2832 - classification_loss: 0.0170 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2999 - regression_loss: 0.2829 - classification_loss: 0.0169 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2996 - regression_loss: 0.2827 - classification_loss: 0.0169 361/500 [====================>.........] - ETA: 1:09 - loss: 0.2991 - regression_loss: 0.2822 - classification_loss: 0.0169 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2988 - regression_loss: 0.2819 - classification_loss: 0.0168 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2992 - regression_loss: 0.2823 - classification_loss: 0.0168 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2991 - regression_loss: 0.2823 - classification_loss: 0.0168 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2988 - regression_loss: 0.2820 - classification_loss: 0.0168 366/500 [====================>.........] 
- ETA: 1:07 - loss: 0.2989 - regression_loss: 0.2821 - classification_loss: 0.0168 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2989 - regression_loss: 0.2821 - classification_loss: 0.0168 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2991 - regression_loss: 0.2823 - classification_loss: 0.0168 369/500 [=====================>........] - ETA: 1:05 - loss: 0.3001 - regression_loss: 0.2833 - classification_loss: 0.0168 370/500 [=====================>........] - ETA: 1:05 - loss: 0.3003 - regression_loss: 0.2835 - classification_loss: 0.0168 371/500 [=====================>........] - ETA: 1:04 - loss: 0.3006 - regression_loss: 0.2839 - classification_loss: 0.0168 372/500 [=====================>........] - ETA: 1:04 - loss: 0.3005 - regression_loss: 0.2837 - classification_loss: 0.0168 373/500 [=====================>........] - ETA: 1:03 - loss: 0.3013 - regression_loss: 0.2846 - classification_loss: 0.0168 374/500 [=====================>........] - ETA: 1:03 - loss: 0.3012 - regression_loss: 0.2845 - classification_loss: 0.0168 375/500 [=====================>........] - ETA: 1:02 - loss: 0.3013 - regression_loss: 0.2846 - classification_loss: 0.0168 376/500 [=====================>........] - ETA: 1:02 - loss: 0.3022 - regression_loss: 0.2854 - classification_loss: 0.0168 377/500 [=====================>........] - ETA: 1:01 - loss: 0.3031 - regression_loss: 0.2863 - classification_loss: 0.0168 378/500 [=====================>........] - ETA: 1:01 - loss: 0.3034 - regression_loss: 0.2865 - classification_loss: 0.0168 379/500 [=====================>........] - ETA: 1:00 - loss: 0.3039 - regression_loss: 0.2871 - classification_loss: 0.0168 380/500 [=====================>........] - ETA: 1:00 - loss: 0.3038 - regression_loss: 0.2870 - classification_loss: 0.0168 381/500 [=====================>........] - ETA: 59s - loss: 0.3035 - regression_loss: 0.2867 - classification_loss: 0.0168  382/500 [=====================>........] 
- ETA: 59s - loss: 0.3031 - regression_loss: 0.2864 - classification_loss: 0.0167 383/500 [=====================>........] - ETA: 58s - loss: 0.3035 - regression_loss: 0.2868 - classification_loss: 0.0168 384/500 [======================>.......] - ETA: 58s - loss: 0.3042 - regression_loss: 0.2874 - classification_loss: 0.0168 385/500 [======================>.......] - ETA: 57s - loss: 0.3044 - regression_loss: 0.2876 - classification_loss: 0.0168 386/500 [======================>.......] - ETA: 57s - loss: 0.3042 - regression_loss: 0.2874 - classification_loss: 0.0168 387/500 [======================>.......] - ETA: 56s - loss: 0.3043 - regression_loss: 0.2875 - classification_loss: 0.0168 388/500 [======================>.......] - ETA: 56s - loss: 0.3040 - regression_loss: 0.2872 - classification_loss: 0.0167 389/500 [======================>.......] - ETA: 55s - loss: 0.3039 - regression_loss: 0.2872 - classification_loss: 0.0167 390/500 [======================>.......] - ETA: 55s - loss: 0.3038 - regression_loss: 0.2871 - classification_loss: 0.0167 391/500 [======================>.......] - ETA: 54s - loss: 0.3035 - regression_loss: 0.2868 - classification_loss: 0.0167 392/500 [======================>.......] - ETA: 54s - loss: 0.3031 - regression_loss: 0.2864 - classification_loss: 0.0167 393/500 [======================>.......] - ETA: 53s - loss: 0.3029 - regression_loss: 0.2862 - classification_loss: 0.0167 394/500 [======================>.......] - ETA: 53s - loss: 0.3028 - regression_loss: 0.2861 - classification_loss: 0.0167 395/500 [======================>.......] - ETA: 52s - loss: 0.3023 - regression_loss: 0.2857 - classification_loss: 0.0167 396/500 [======================>.......] - ETA: 52s - loss: 0.3024 - regression_loss: 0.2857 - classification_loss: 0.0167 397/500 [======================>.......] - ETA: 51s - loss: 0.3021 - regression_loss: 0.2854 - classification_loss: 0.0166 398/500 [======================>.......] 
- ETA: 51s - loss: 0.3020 - regression_loss: 0.2854 - classification_loss: 0.0166 399/500 [======================>.......] - ETA: 50s - loss: 0.3024 - regression_loss: 0.2857 - classification_loss: 0.0166 400/500 [=======================>......] - ETA: 50s - loss: 0.3023 - regression_loss: 0.2857 - classification_loss: 0.0167 401/500 [=======================>......] - ETA: 49s - loss: 0.3018 - regression_loss: 0.2852 - classification_loss: 0.0166 402/500 [=======================>......] - ETA: 49s - loss: 0.3019 - regression_loss: 0.2853 - classification_loss: 0.0166 403/500 [=======================>......] - ETA: 48s - loss: 0.3018 - regression_loss: 0.2852 - classification_loss: 0.0166 404/500 [=======================>......] - ETA: 48s - loss: 0.3018 - regression_loss: 0.2852 - classification_loss: 0.0166 405/500 [=======================>......] - ETA: 47s - loss: 0.3021 - regression_loss: 0.2855 - classification_loss: 0.0166 406/500 [=======================>......] - ETA: 47s - loss: 0.3020 - regression_loss: 0.2854 - classification_loss: 0.0166 407/500 [=======================>......] - ETA: 46s - loss: 0.3017 - regression_loss: 0.2851 - classification_loss: 0.0166 408/500 [=======================>......] - ETA: 46s - loss: 0.3017 - regression_loss: 0.2851 - classification_loss: 0.0166 409/500 [=======================>......] - ETA: 45s - loss: 0.3013 - regression_loss: 0.2847 - classification_loss: 0.0166 410/500 [=======================>......] - ETA: 45s - loss: 0.3017 - regression_loss: 0.2851 - classification_loss: 0.0166 411/500 [=======================>......] - ETA: 44s - loss: 0.3027 - regression_loss: 0.2860 - classification_loss: 0.0167 412/500 [=======================>......] - ETA: 44s - loss: 0.3027 - regression_loss: 0.2860 - classification_loss: 0.0167 413/500 [=======================>......] - ETA: 43s - loss: 0.3026 - regression_loss: 0.2859 - classification_loss: 0.0167 414/500 [=======================>......] 
- ETA: 43s - loss: 0.3027 - regression_loss: 0.2860 - classification_loss: 0.0167 415/500 [=======================>......] - ETA: 42s - loss: 0.3027 - regression_loss: 0.2860 - classification_loss: 0.0167 416/500 [=======================>......] - ETA: 42s - loss: 0.3029 - regression_loss: 0.2863 - classification_loss: 0.0167 417/500 [========================>.....] - ETA: 41s - loss: 0.3028 - regression_loss: 0.2861 - classification_loss: 0.0166 418/500 [========================>.....] - ETA: 41s - loss: 0.3024 - regression_loss: 0.2858 - classification_loss: 0.0166 419/500 [========================>.....] - ETA: 40s - loss: 0.3023 - regression_loss: 0.2857 - classification_loss: 0.0166 420/500 [========================>.....] - ETA: 40s - loss: 0.3027 - regression_loss: 0.2860 - classification_loss: 0.0166 421/500 [========================>.....] - ETA: 39s - loss: 0.3027 - regression_loss: 0.2861 - classification_loss: 0.0166 422/500 [========================>.....] - ETA: 39s - loss: 0.3028 - regression_loss: 0.2861 - classification_loss: 0.0166 423/500 [========================>.....] - ETA: 38s - loss: 0.3029 - regression_loss: 0.2862 - classification_loss: 0.0167 424/500 [========================>.....] - ETA: 38s - loss: 0.3030 - regression_loss: 0.2863 - classification_loss: 0.0167 425/500 [========================>.....] - ETA: 37s - loss: 0.3033 - regression_loss: 0.2865 - classification_loss: 0.0167 426/500 [========================>.....] - ETA: 37s - loss: 0.3033 - regression_loss: 0.2866 - classification_loss: 0.0167 427/500 [========================>.....] - ETA: 36s - loss: 0.3030 - regression_loss: 0.2863 - classification_loss: 0.0167 428/500 [========================>.....] - ETA: 36s - loss: 0.3026 - regression_loss: 0.2859 - classification_loss: 0.0167 429/500 [========================>.....] - ETA: 35s - loss: 0.3022 - regression_loss: 0.2855 - classification_loss: 0.0167 430/500 [========================>.....] 
- ETA: 35s - loss: 0.3021 - regression_loss: 0.2854 - classification_loss: 0.0167 431/500 [========================>.....] - ETA: 34s - loss: 0.3017 - regression_loss: 0.2851 - classification_loss: 0.0166 432/500 [========================>.....] - ETA: 34s - loss: 0.3013 - regression_loss: 0.2847 - classification_loss: 0.0166 433/500 [========================>.....] - ETA: 33s - loss: 0.3013 - regression_loss: 0.2847 - classification_loss: 0.0166 434/500 [=========================>....] - ETA: 33s - loss: 0.3014 - regression_loss: 0.2848 - classification_loss: 0.0166 435/500 [=========================>....] - ETA: 32s - loss: 0.3011 - regression_loss: 0.2845 - classification_loss: 0.0166 436/500 [=========================>....] - ETA: 32s - loss: 0.3010 - regression_loss: 0.2844 - classification_loss: 0.0166 437/500 [=========================>....] - ETA: 31s - loss: 0.3013 - regression_loss: 0.2847 - classification_loss: 0.0167 438/500 [=========================>....] - ETA: 31s - loss: 0.3011 - regression_loss: 0.2845 - classification_loss: 0.0166 439/500 [=========================>....] - ETA: 30s - loss: 0.3008 - regression_loss: 0.2842 - classification_loss: 0.0166 440/500 [=========================>....] - ETA: 30s - loss: 0.3011 - regression_loss: 0.2845 - classification_loss: 0.0166 441/500 [=========================>....] - ETA: 29s - loss: 0.3007 - regression_loss: 0.2841 - classification_loss: 0.0166 442/500 [=========================>....] - ETA: 29s - loss: 0.3010 - regression_loss: 0.2844 - classification_loss: 0.0166 443/500 [=========================>....] - ETA: 28s - loss: 0.3010 - regression_loss: 0.2844 - classification_loss: 0.0166 444/500 [=========================>....] - ETA: 28s - loss: 0.3008 - regression_loss: 0.2842 - classification_loss: 0.0166 445/500 [=========================>....] - ETA: 27s - loss: 0.3005 - regression_loss: 0.2839 - classification_loss: 0.0166 446/500 [=========================>....] 
[... per-batch progress-bar updates for epoch 19, batches 447-499, elided (loss hovering around 0.299) ...]
500/500 [==============================] - 251s 503ms/step - loss: 0.2986 - regression_loss: 0.2820 - classification_loss: 0.0166
1172 instances of class plum with average precision: 0.7315
mAP: 0.7315
Epoch 00019: saving model to ./training/snapshots/resnet101_pascal_19.h5
Epoch 20/150
[... per-batch progress-bar updates for epoch 20, batches 1-281, elided (loss around 0.296, ETA counting down from 4:09; log truncated mid-epoch) ...]
- ETA: 1:50 - loss: 0.2957 - regression_loss: 0.2796 - classification_loss: 0.0161 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2959 - regression_loss: 0.2798 - classification_loss: 0.0161 283/500 [===============>..............] - ETA: 1:49 - loss: 0.2963 - regression_loss: 0.2801 - classification_loss: 0.0161 284/500 [================>.............] - ETA: 1:48 - loss: 0.2965 - regression_loss: 0.2803 - classification_loss: 0.0161 285/500 [================>.............] - ETA: 1:48 - loss: 0.2966 - regression_loss: 0.2805 - classification_loss: 0.0161 286/500 [================>.............] - ETA: 1:47 - loss: 0.2962 - regression_loss: 0.2801 - classification_loss: 0.0161 287/500 [================>.............] - ETA: 1:47 - loss: 0.2963 - regression_loss: 0.2802 - classification_loss: 0.0161 288/500 [================>.............] - ETA: 1:46 - loss: 0.2963 - regression_loss: 0.2802 - classification_loss: 0.0161 289/500 [================>.............] - ETA: 1:46 - loss: 0.2961 - regression_loss: 0.2800 - classification_loss: 0.0161 290/500 [================>.............] - ETA: 1:45 - loss: 0.2957 - regression_loss: 0.2796 - classification_loss: 0.0161 291/500 [================>.............] - ETA: 1:45 - loss: 0.2953 - regression_loss: 0.2793 - classification_loss: 0.0160 292/500 [================>.............] - ETA: 1:44 - loss: 0.2957 - regression_loss: 0.2797 - classification_loss: 0.0161 293/500 [================>.............] - ETA: 1:44 - loss: 0.2955 - regression_loss: 0.2794 - classification_loss: 0.0160 294/500 [================>.............] - ETA: 1:43 - loss: 0.2951 - regression_loss: 0.2791 - classification_loss: 0.0160 295/500 [================>.............] - ETA: 1:43 - loss: 0.2946 - regression_loss: 0.2786 - classification_loss: 0.0159 296/500 [================>.............] - ETA: 1:42 - loss: 0.2939 - regression_loss: 0.2780 - classification_loss: 0.0159 297/500 [================>.............] 
- ETA: 1:42 - loss: 0.2936 - regression_loss: 0.2777 - classification_loss: 0.0159 298/500 [================>.............] - ETA: 1:41 - loss: 0.2929 - regression_loss: 0.2771 - classification_loss: 0.0159 299/500 [================>.............] - ETA: 1:41 - loss: 0.2924 - regression_loss: 0.2766 - classification_loss: 0.0158 300/500 [=================>............] - ETA: 1:40 - loss: 0.2921 - regression_loss: 0.2763 - classification_loss: 0.0158 301/500 [=================>............] - ETA: 1:40 - loss: 0.2919 - regression_loss: 0.2761 - classification_loss: 0.0158 302/500 [=================>............] - ETA: 1:39 - loss: 0.2920 - regression_loss: 0.2762 - classification_loss: 0.0158 303/500 [=================>............] - ETA: 1:39 - loss: 0.2917 - regression_loss: 0.2760 - classification_loss: 0.0158 304/500 [=================>............] - ETA: 1:38 - loss: 0.2915 - regression_loss: 0.2757 - classification_loss: 0.0158 305/500 [=================>............] - ETA: 1:38 - loss: 0.2923 - regression_loss: 0.2765 - classification_loss: 0.0158 306/500 [=================>............] - ETA: 1:37 - loss: 0.2923 - regression_loss: 0.2766 - classification_loss: 0.0157 307/500 [=================>............] - ETA: 1:37 - loss: 0.2924 - regression_loss: 0.2767 - classification_loss: 0.0157 308/500 [=================>............] - ETA: 1:36 - loss: 0.2924 - regression_loss: 0.2767 - classification_loss: 0.0157 309/500 [=================>............] - ETA: 1:36 - loss: 0.2923 - regression_loss: 0.2766 - classification_loss: 0.0157 310/500 [=================>............] - ETA: 1:35 - loss: 0.2927 - regression_loss: 0.2769 - classification_loss: 0.0157 311/500 [=================>............] - ETA: 1:35 - loss: 0.2930 - regression_loss: 0.2772 - classification_loss: 0.0157 312/500 [=================>............] - ETA: 1:34 - loss: 0.2932 - regression_loss: 0.2774 - classification_loss: 0.0158 313/500 [=================>............] 
- ETA: 1:34 - loss: 0.2931 - regression_loss: 0.2773 - classification_loss: 0.0157 314/500 [=================>............] - ETA: 1:33 - loss: 0.2928 - regression_loss: 0.2771 - classification_loss: 0.0157 315/500 [=================>............] - ETA: 1:33 - loss: 0.2927 - regression_loss: 0.2770 - classification_loss: 0.0157 316/500 [=================>............] - ETA: 1:32 - loss: 0.2926 - regression_loss: 0.2769 - classification_loss: 0.0157 317/500 [==================>...........] - ETA: 1:32 - loss: 0.2927 - regression_loss: 0.2769 - classification_loss: 0.0158 318/500 [==================>...........] - ETA: 1:31 - loss: 0.2922 - regression_loss: 0.2764 - classification_loss: 0.0158 319/500 [==================>...........] - ETA: 1:31 - loss: 0.2918 - regression_loss: 0.2761 - classification_loss: 0.0157 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2914 - regression_loss: 0.2757 - classification_loss: 0.0157 321/500 [==================>...........] - ETA: 1:30 - loss: 0.2908 - regression_loss: 0.2751 - classification_loss: 0.0157 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2909 - regression_loss: 0.2753 - classification_loss: 0.0157 323/500 [==================>...........] - ETA: 1:29 - loss: 0.2909 - regression_loss: 0.2752 - classification_loss: 0.0157 324/500 [==================>...........] - ETA: 1:28 - loss: 0.2914 - regression_loss: 0.2757 - classification_loss: 0.0157 325/500 [==================>...........] - ETA: 1:28 - loss: 0.2919 - regression_loss: 0.2762 - classification_loss: 0.0157 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2922 - regression_loss: 0.2765 - classification_loss: 0.0157 327/500 [==================>...........] - ETA: 1:27 - loss: 0.2928 - regression_loss: 0.2769 - classification_loss: 0.0159 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2931 - regression_loss: 0.2772 - classification_loss: 0.0160 329/500 [==================>...........] 
- ETA: 1:26 - loss: 0.2936 - regression_loss: 0.2775 - classification_loss: 0.0160 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2930 - regression_loss: 0.2770 - classification_loss: 0.0160 331/500 [==================>...........] - ETA: 1:25 - loss: 0.2930 - regression_loss: 0.2770 - classification_loss: 0.0160 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2924 - regression_loss: 0.2765 - classification_loss: 0.0160 333/500 [==================>...........] - ETA: 1:24 - loss: 0.2923 - regression_loss: 0.2763 - classification_loss: 0.0160 334/500 [===================>..........] - ETA: 1:23 - loss: 0.2921 - regression_loss: 0.2761 - classification_loss: 0.0160 335/500 [===================>..........] - ETA: 1:23 - loss: 0.2919 - regression_loss: 0.2759 - classification_loss: 0.0160 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2918 - regression_loss: 0.2759 - classification_loss: 0.0160 337/500 [===================>..........] - ETA: 1:21 - loss: 0.2918 - regression_loss: 0.2758 - classification_loss: 0.0160 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2911 - regression_loss: 0.2752 - classification_loss: 0.0159 339/500 [===================>..........] - ETA: 1:20 - loss: 0.2908 - regression_loss: 0.2748 - classification_loss: 0.0160 340/500 [===================>..........] - ETA: 1:20 - loss: 0.2908 - regression_loss: 0.2748 - classification_loss: 0.0160 341/500 [===================>..........] - ETA: 1:19 - loss: 0.2917 - regression_loss: 0.2757 - classification_loss: 0.0160 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2914 - regression_loss: 0.2754 - classification_loss: 0.0160 343/500 [===================>..........] - ETA: 1:18 - loss: 0.2910 - regression_loss: 0.2750 - classification_loss: 0.0159 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2911 - regression_loss: 0.2752 - classification_loss: 0.0159 345/500 [===================>..........] 
- ETA: 1:17 - loss: 0.2906 - regression_loss: 0.2747 - classification_loss: 0.0159 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2909 - regression_loss: 0.2750 - classification_loss: 0.0159 347/500 [===================>..........] - ETA: 1:16 - loss: 0.2905 - regression_loss: 0.2746 - classification_loss: 0.0159 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2902 - regression_loss: 0.2743 - classification_loss: 0.0159 349/500 [===================>..........] - ETA: 1:15 - loss: 0.2898 - regression_loss: 0.2740 - classification_loss: 0.0159 350/500 [====================>.........] - ETA: 1:15 - loss: 0.2899 - regression_loss: 0.2740 - classification_loss: 0.0159 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2901 - regression_loss: 0.2741 - classification_loss: 0.0159 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2906 - regression_loss: 0.2746 - classification_loss: 0.0160 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2904 - regression_loss: 0.2744 - classification_loss: 0.0160 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2904 - regression_loss: 0.2744 - classification_loss: 0.0160 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2905 - regression_loss: 0.2746 - classification_loss: 0.0160 356/500 [====================>.........] - ETA: 1:12 - loss: 0.2906 - regression_loss: 0.2747 - classification_loss: 0.0160 357/500 [====================>.........] - ETA: 1:11 - loss: 0.2902 - regression_loss: 0.2742 - classification_loss: 0.0159 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2899 - regression_loss: 0.2740 - classification_loss: 0.0159 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2902 - regression_loss: 0.2743 - classification_loss: 0.0159 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2908 - regression_loss: 0.2748 - classification_loss: 0.0160 361/500 [====================>.........] 
- ETA: 1:09 - loss: 0.2906 - regression_loss: 0.2747 - classification_loss: 0.0160 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2903 - regression_loss: 0.2744 - classification_loss: 0.0159 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2901 - regression_loss: 0.2742 - classification_loss: 0.0159 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2899 - regression_loss: 0.2740 - classification_loss: 0.0159 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2901 - regression_loss: 0.2741 - classification_loss: 0.0159 366/500 [====================>.........] - ETA: 1:07 - loss: 0.2904 - regression_loss: 0.2745 - classification_loss: 0.0159 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2906 - regression_loss: 0.2747 - classification_loss: 0.0159 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2904 - regression_loss: 0.2745 - classification_loss: 0.0159 369/500 [=====================>........] - ETA: 1:05 - loss: 0.2901 - regression_loss: 0.2743 - classification_loss: 0.0159 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2900 - regression_loss: 0.2742 - classification_loss: 0.0158 371/500 [=====================>........] - ETA: 1:04 - loss: 0.2900 - regression_loss: 0.2741 - classification_loss: 0.0158 372/500 [=====================>........] - ETA: 1:04 - loss: 0.2895 - regression_loss: 0.2737 - classification_loss: 0.0158 373/500 [=====================>........] - ETA: 1:03 - loss: 0.2895 - regression_loss: 0.2737 - classification_loss: 0.0158 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2899 - regression_loss: 0.2740 - classification_loss: 0.0159 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2900 - regression_loss: 0.2741 - classification_loss: 0.0159 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2900 - regression_loss: 0.2741 - classification_loss: 0.0158 377/500 [=====================>........] 
- ETA: 1:01 - loss: 0.2897 - regression_loss: 0.2739 - classification_loss: 0.0158 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2894 - regression_loss: 0.2736 - classification_loss: 0.0158 379/500 [=====================>........] - ETA: 1:00 - loss: 0.2898 - regression_loss: 0.2740 - classification_loss: 0.0158 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2897 - regression_loss: 0.2739 - classification_loss: 0.0158 381/500 [=====================>........] - ETA: 59s - loss: 0.2897 - regression_loss: 0.2739 - classification_loss: 0.0158  382/500 [=====================>........] - ETA: 59s - loss: 0.2898 - regression_loss: 0.2739 - classification_loss: 0.0158 383/500 [=====================>........] - ETA: 58s - loss: 0.2899 - regression_loss: 0.2740 - classification_loss: 0.0158 384/500 [======================>.......] - ETA: 58s - loss: 0.2896 - regression_loss: 0.2737 - classification_loss: 0.0158 385/500 [======================>.......] - ETA: 57s - loss: 0.2897 - regression_loss: 0.2739 - classification_loss: 0.0158 386/500 [======================>.......] - ETA: 57s - loss: 0.2896 - regression_loss: 0.2738 - classification_loss: 0.0158 387/500 [======================>.......] - ETA: 56s - loss: 0.2895 - regression_loss: 0.2737 - classification_loss: 0.0158 388/500 [======================>.......] - ETA: 56s - loss: 0.2891 - regression_loss: 0.2733 - classification_loss: 0.0158 389/500 [======================>.......] - ETA: 55s - loss: 0.2894 - regression_loss: 0.2737 - classification_loss: 0.0158 390/500 [======================>.......] - ETA: 55s - loss: 0.2899 - regression_loss: 0.2741 - classification_loss: 0.0158 391/500 [======================>.......] - ETA: 54s - loss: 0.2898 - regression_loss: 0.2739 - classification_loss: 0.0158 392/500 [======================>.......] - ETA: 54s - loss: 0.2904 - regression_loss: 0.2746 - classification_loss: 0.0158 393/500 [======================>.......] 
- ETA: 53s - loss: 0.2907 - regression_loss: 0.2749 - classification_loss: 0.0158 394/500 [======================>.......] - ETA: 53s - loss: 0.2910 - regression_loss: 0.2751 - classification_loss: 0.0159 395/500 [======================>.......] - ETA: 52s - loss: 0.2908 - regression_loss: 0.2749 - classification_loss: 0.0158 396/500 [======================>.......] - ETA: 52s - loss: 0.2908 - regression_loss: 0.2750 - classification_loss: 0.0158 397/500 [======================>.......] - ETA: 51s - loss: 0.2912 - regression_loss: 0.2753 - classification_loss: 0.0159 398/500 [======================>.......] - ETA: 51s - loss: 0.2909 - regression_loss: 0.2751 - classification_loss: 0.0159 399/500 [======================>.......] - ETA: 50s - loss: 0.2906 - regression_loss: 0.2748 - classification_loss: 0.0158 400/500 [=======================>......] - ETA: 50s - loss: 0.2910 - regression_loss: 0.2752 - classification_loss: 0.0158 401/500 [=======================>......] - ETA: 49s - loss: 0.2913 - regression_loss: 0.2754 - classification_loss: 0.0158 402/500 [=======================>......] - ETA: 49s - loss: 0.2915 - regression_loss: 0.2756 - classification_loss: 0.0158 403/500 [=======================>......] - ETA: 48s - loss: 0.2921 - regression_loss: 0.2762 - classification_loss: 0.0159 404/500 [=======================>......] - ETA: 48s - loss: 0.2924 - regression_loss: 0.2765 - classification_loss: 0.0159 405/500 [=======================>......] - ETA: 47s - loss: 0.2924 - regression_loss: 0.2765 - classification_loss: 0.0159 406/500 [=======================>......] - ETA: 47s - loss: 0.2923 - regression_loss: 0.2764 - classification_loss: 0.0159 407/500 [=======================>......] - ETA: 46s - loss: 0.2922 - regression_loss: 0.2763 - classification_loss: 0.0159 408/500 [=======================>......] - ETA: 46s - loss: 0.2921 - regression_loss: 0.2762 - classification_loss: 0.0159 409/500 [=======================>......] 
- ETA: 45s - loss: 0.2919 - regression_loss: 0.2760 - classification_loss: 0.0159 410/500 [=======================>......] - ETA: 45s - loss: 0.2919 - regression_loss: 0.2760 - classification_loss: 0.0159 411/500 [=======================>......] - ETA: 44s - loss: 0.2918 - regression_loss: 0.2759 - classification_loss: 0.0159 412/500 [=======================>......] - ETA: 44s - loss: 0.2921 - regression_loss: 0.2763 - classification_loss: 0.0158 413/500 [=======================>......] - ETA: 43s - loss: 0.2924 - regression_loss: 0.2765 - classification_loss: 0.0158 414/500 [=======================>......] - ETA: 43s - loss: 0.2924 - regression_loss: 0.2766 - classification_loss: 0.0158 415/500 [=======================>......] - ETA: 42s - loss: 0.2925 - regression_loss: 0.2767 - classification_loss: 0.0158 416/500 [=======================>......] - ETA: 42s - loss: 0.2931 - regression_loss: 0.2772 - classification_loss: 0.0159 417/500 [========================>.....] - ETA: 41s - loss: 0.2928 - regression_loss: 0.2769 - classification_loss: 0.0159 418/500 [========================>.....] - ETA: 41s - loss: 0.2926 - regression_loss: 0.2767 - classification_loss: 0.0158 419/500 [========================>.....] - ETA: 40s - loss: 0.2922 - regression_loss: 0.2764 - classification_loss: 0.0158 420/500 [========================>.....] - ETA: 40s - loss: 0.2926 - regression_loss: 0.2767 - classification_loss: 0.0159 421/500 [========================>.....] - ETA: 39s - loss: 0.2922 - regression_loss: 0.2763 - classification_loss: 0.0159 422/500 [========================>.....] - ETA: 39s - loss: 0.2924 - regression_loss: 0.2765 - classification_loss: 0.0158 423/500 [========================>.....] - ETA: 38s - loss: 0.2926 - regression_loss: 0.2767 - classification_loss: 0.0159 424/500 [========================>.....] - ETA: 38s - loss: 0.2926 - regression_loss: 0.2767 - classification_loss: 0.0158 425/500 [========================>.....] 
- ETA: 37s - loss: 0.2950 - regression_loss: 0.2791 - classification_loss: 0.0160 426/500 [========================>.....] - ETA: 37s - loss: 0.2953 - regression_loss: 0.2794 - classification_loss: 0.0159 427/500 [========================>.....] - ETA: 36s - loss: 0.2957 - regression_loss: 0.2797 - classification_loss: 0.0160 428/500 [========================>.....] - ETA: 36s - loss: 0.2963 - regression_loss: 0.2802 - classification_loss: 0.0160 429/500 [========================>.....] - ETA: 35s - loss: 0.2968 - regression_loss: 0.2806 - classification_loss: 0.0161 430/500 [========================>.....] - ETA: 35s - loss: 0.2970 - regression_loss: 0.2809 - classification_loss: 0.0161 431/500 [========================>.....] - ETA: 34s - loss: 0.2976 - regression_loss: 0.2814 - classification_loss: 0.0162 432/500 [========================>.....] - ETA: 34s - loss: 0.2979 - regression_loss: 0.2817 - classification_loss: 0.0162 433/500 [========================>.....] - ETA: 33s - loss: 0.2978 - regression_loss: 0.2816 - classification_loss: 0.0162 434/500 [=========================>....] - ETA: 33s - loss: 0.2980 - regression_loss: 0.2818 - classification_loss: 0.0162 435/500 [=========================>....] - ETA: 32s - loss: 0.2985 - regression_loss: 0.2822 - classification_loss: 0.0162 436/500 [=========================>....] - ETA: 32s - loss: 0.2984 - regression_loss: 0.2822 - classification_loss: 0.0162 437/500 [=========================>....] - ETA: 31s - loss: 0.2983 - regression_loss: 0.2821 - classification_loss: 0.0162 438/500 [=========================>....] - ETA: 31s - loss: 0.2988 - regression_loss: 0.2825 - classification_loss: 0.0163 439/500 [=========================>....] - ETA: 30s - loss: 0.2998 - regression_loss: 0.2835 - classification_loss: 0.0164 440/500 [=========================>....] - ETA: 30s - loss: 0.3000 - regression_loss: 0.2836 - classification_loss: 0.0164 441/500 [=========================>....] 
- ETA: 29s - loss: 0.3005 - regression_loss: 0.2840 - classification_loss: 0.0164 442/500 [=========================>....] - ETA: 29s - loss: 0.3013 - regression_loss: 0.2848 - classification_loss: 0.0164 443/500 [=========================>....] - ETA: 28s - loss: 0.3010 - regression_loss: 0.2846 - classification_loss: 0.0164 444/500 [=========================>....] - ETA: 28s - loss: 0.3012 - regression_loss: 0.2847 - classification_loss: 0.0165 445/500 [=========================>....] - ETA: 27s - loss: 0.3008 - regression_loss: 0.2844 - classification_loss: 0.0164 446/500 [=========================>....] - ETA: 27s - loss: 0.3012 - regression_loss: 0.2847 - classification_loss: 0.0165 447/500 [=========================>....] - ETA: 26s - loss: 0.3010 - regression_loss: 0.2846 - classification_loss: 0.0165 448/500 [=========================>....] - ETA: 26s - loss: 0.3008 - regression_loss: 0.2844 - classification_loss: 0.0164 449/500 [=========================>....] - ETA: 25s - loss: 0.3012 - regression_loss: 0.2847 - classification_loss: 0.0165 450/500 [==========================>...] - ETA: 25s - loss: 0.3009 - regression_loss: 0.2845 - classification_loss: 0.0164 451/500 [==========================>...] - ETA: 24s - loss: 0.3014 - regression_loss: 0.2849 - classification_loss: 0.0165 452/500 [==========================>...] - ETA: 24s - loss: 0.3016 - regression_loss: 0.2851 - classification_loss: 0.0165 453/500 [==========================>...] - ETA: 23s - loss: 0.3018 - regression_loss: 0.2853 - classification_loss: 0.0165 454/500 [==========================>...] - ETA: 23s - loss: 0.3015 - regression_loss: 0.2850 - classification_loss: 0.0165 455/500 [==========================>...] - ETA: 22s - loss: 0.3014 - regression_loss: 0.2849 - classification_loss: 0.0165 456/500 [==========================>...] - ETA: 22s - loss: 0.3012 - regression_loss: 0.2848 - classification_loss: 0.0165 457/500 [==========================>...] 
- ETA: 21s - loss: 0.3017 - regression_loss: 0.2853 - classification_loss: 0.0165 458/500 [==========================>...] - ETA: 21s - loss: 0.3017 - regression_loss: 0.2853 - classification_loss: 0.0165 459/500 [==========================>...] - ETA: 20s - loss: 0.3017 - regression_loss: 0.2852 - classification_loss: 0.0165 460/500 [==========================>...] - ETA: 20s - loss: 0.3016 - regression_loss: 0.2851 - classification_loss: 0.0165 461/500 [==========================>...] - ETA: 19s - loss: 0.3015 - regression_loss: 0.2851 - classification_loss: 0.0165 462/500 [==========================>...] - ETA: 19s - loss: 0.3013 - regression_loss: 0.2849 - classification_loss: 0.0165 463/500 [==========================>...] - ETA: 18s - loss: 0.3009 - regression_loss: 0.2845 - classification_loss: 0.0165 464/500 [==========================>...] - ETA: 18s - loss: 0.3013 - regression_loss: 0.2848 - classification_loss: 0.0166 465/500 [==========================>...] - ETA: 17s - loss: 0.3015 - regression_loss: 0.2849 - classification_loss: 0.0166 466/500 [==========================>...] - ETA: 17s - loss: 0.3015 - regression_loss: 0.2849 - classification_loss: 0.0166 467/500 [===========================>..] - ETA: 16s - loss: 0.3015 - regression_loss: 0.2849 - classification_loss: 0.0166 468/500 [===========================>..] - ETA: 16s - loss: 0.3016 - regression_loss: 0.2850 - classification_loss: 0.0166 469/500 [===========================>..] - ETA: 15s - loss: 0.3013 - regression_loss: 0.2847 - classification_loss: 0.0166 470/500 [===========================>..] - ETA: 15s - loss: 0.3011 - regression_loss: 0.2845 - classification_loss: 0.0166 471/500 [===========================>..] - ETA: 14s - loss: 0.3012 - regression_loss: 0.2846 - classification_loss: 0.0166 472/500 [===========================>..] - ETA: 14s - loss: 0.3013 - regression_loss: 0.2847 - classification_loss: 0.0166 473/500 [===========================>..] 
- ETA: 13s - loss: 0.3012 - regression_loss: 0.2846 - classification_loss: 0.0166 474/500 [===========================>..] - ETA: 13s - loss: 0.3009 - regression_loss: 0.2843 - classification_loss: 0.0166 475/500 [===========================>..] - ETA: 12s - loss: 0.3007 - regression_loss: 0.2841 - classification_loss: 0.0165 476/500 [===========================>..] - ETA: 12s - loss: 0.3004 - regression_loss: 0.2839 - classification_loss: 0.0165 477/500 [===========================>..] - ETA: 11s - loss: 0.3004 - regression_loss: 0.2839 - classification_loss: 0.0165 478/500 [===========================>..] - ETA: 11s - loss: 0.3000 - regression_loss: 0.2835 - classification_loss: 0.0165 479/500 [===========================>..] - ETA: 10s - loss: 0.2997 - regression_loss: 0.2832 - classification_loss: 0.0165 480/500 [===========================>..] - ETA: 10s - loss: 0.2993 - regression_loss: 0.2828 - classification_loss: 0.0165 481/500 [===========================>..] - ETA: 9s - loss: 0.2990 - regression_loss: 0.2825 - classification_loss: 0.0164  482/500 [===========================>..] - ETA: 9s - loss: 0.2993 - regression_loss: 0.2828 - classification_loss: 0.0165 483/500 [===========================>..] - ETA: 8s - loss: 0.2990 - regression_loss: 0.2826 - classification_loss: 0.0165 484/500 [============================>.] - ETA: 8s - loss: 0.2993 - regression_loss: 0.2828 - classification_loss: 0.0165 485/500 [============================>.] - ETA: 7s - loss: 0.2991 - regression_loss: 0.2827 - classification_loss: 0.0165 486/500 [============================>.] - ETA: 7s - loss: 0.2988 - regression_loss: 0.2823 - classification_loss: 0.0165 487/500 [============================>.] - ETA: 6s - loss: 0.2983 - regression_loss: 0.2819 - classification_loss: 0.0164 488/500 [============================>.] - ETA: 6s - loss: 0.2979 - regression_loss: 0.2815 - classification_loss: 0.0164 489/500 [============================>.] 
500/500 [==============================] - 251s 503ms/step - loss: 0.2967 - regression_loss: 0.2804 - classification_loss: 0.0163
1172 instances of class plum with average precision: 0.7469
mAP: 0.7469
Epoch 00020: saving model to ./training/snapshots/resnet101_pascal_20.h5
Epoch 21/150
[per-batch progress bar updates for steps 1-323 elided; loss fluctuated between roughly 0.19 and 0.30, ending near 0.286]
324/500 [==================>...........]
- ETA: 1:28 - loss: 0.2863 - regression_loss: 0.2705 - classification_loss: 0.0158 325/500 [==================>...........] - ETA: 1:27 - loss: 0.2868 - regression_loss: 0.2709 - classification_loss: 0.0159 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2868 - regression_loss: 0.2709 - classification_loss: 0.0159 327/500 [==================>...........] - ETA: 1:26 - loss: 0.2871 - regression_loss: 0.2712 - classification_loss: 0.0159 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2870 - regression_loss: 0.2711 - classification_loss: 0.0159 329/500 [==================>...........] - ETA: 1:25 - loss: 0.2867 - regression_loss: 0.2708 - classification_loss: 0.0159 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2869 - regression_loss: 0.2710 - classification_loss: 0.0159 331/500 [==================>...........] - ETA: 1:24 - loss: 0.2874 - regression_loss: 0.2715 - classification_loss: 0.0159 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2872 - regression_loss: 0.2713 - classification_loss: 0.0159 333/500 [==================>...........] - ETA: 1:23 - loss: 0.2874 - regression_loss: 0.2715 - classification_loss: 0.0159 334/500 [===================>..........] - ETA: 1:23 - loss: 0.2872 - regression_loss: 0.2713 - classification_loss: 0.0159 335/500 [===================>..........] - ETA: 1:22 - loss: 0.2868 - regression_loss: 0.2709 - classification_loss: 0.0159 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2866 - regression_loss: 0.2708 - classification_loss: 0.0159 337/500 [===================>..........] - ETA: 1:21 - loss: 0.2872 - regression_loss: 0.2713 - classification_loss: 0.0160 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2871 - regression_loss: 0.2711 - classification_loss: 0.0160 339/500 [===================>..........] - ETA: 1:20 - loss: 0.2875 - regression_loss: 0.2715 - classification_loss: 0.0160 340/500 [===================>..........] 
- ETA: 1:20 - loss: 0.2876 - regression_loss: 0.2715 - classification_loss: 0.0160 341/500 [===================>..........] - ETA: 1:19 - loss: 0.2872 - regression_loss: 0.2712 - classification_loss: 0.0160 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2872 - regression_loss: 0.2712 - classification_loss: 0.0160 343/500 [===================>..........] - ETA: 1:18 - loss: 0.2876 - regression_loss: 0.2716 - classification_loss: 0.0160 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2877 - regression_loss: 0.2717 - classification_loss: 0.0160 345/500 [===================>..........] - ETA: 1:17 - loss: 0.2874 - regression_loss: 0.2714 - classification_loss: 0.0160 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2872 - regression_loss: 0.2712 - classification_loss: 0.0160 347/500 [===================>..........] - ETA: 1:16 - loss: 0.2872 - regression_loss: 0.2712 - classification_loss: 0.0160 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2872 - regression_loss: 0.2712 - classification_loss: 0.0160 349/500 [===================>..........] - ETA: 1:15 - loss: 0.2870 - regression_loss: 0.2711 - classification_loss: 0.0160 350/500 [====================>.........] - ETA: 1:15 - loss: 0.2866 - regression_loss: 0.2707 - classification_loss: 0.0159 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2870 - regression_loss: 0.2711 - classification_loss: 0.0159 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2871 - regression_loss: 0.2711 - classification_loss: 0.0159 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2872 - regression_loss: 0.2713 - classification_loss: 0.0159 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2874 - regression_loss: 0.2715 - classification_loss: 0.0159 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2874 - regression_loss: 0.2715 - classification_loss: 0.0159 356/500 [====================>.........] 
- ETA: 1:12 - loss: 0.2876 - regression_loss: 0.2716 - classification_loss: 0.0160 357/500 [====================>.........] - ETA: 1:11 - loss: 0.2871 - regression_loss: 0.2712 - classification_loss: 0.0159 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2874 - regression_loss: 0.2714 - classification_loss: 0.0160 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2881 - regression_loss: 0.2720 - classification_loss: 0.0161 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2879 - regression_loss: 0.2718 - classification_loss: 0.0161 361/500 [====================>.........] - ETA: 1:09 - loss: 0.2873 - regression_loss: 0.2713 - classification_loss: 0.0161 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2872 - regression_loss: 0.2712 - classification_loss: 0.0160 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2870 - regression_loss: 0.2710 - classification_loss: 0.0160 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2872 - regression_loss: 0.2711 - classification_loss: 0.0160 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2867 - regression_loss: 0.2708 - classification_loss: 0.0160 366/500 [====================>.........] - ETA: 1:07 - loss: 0.2865 - regression_loss: 0.2705 - classification_loss: 0.0160 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2863 - regression_loss: 0.2704 - classification_loss: 0.0160 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2861 - regression_loss: 0.2702 - classification_loss: 0.0159 369/500 [=====================>........] - ETA: 1:05 - loss: 0.2861 - regression_loss: 0.2702 - classification_loss: 0.0159 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2860 - regression_loss: 0.2701 - classification_loss: 0.0159 371/500 [=====================>........] - ETA: 1:04 - loss: 0.2861 - regression_loss: 0.2703 - classification_loss: 0.0159 372/500 [=====================>........] 
- ETA: 1:04 - loss: 0.2866 - regression_loss: 0.2707 - classification_loss: 0.0159 373/500 [=====================>........] - ETA: 1:03 - loss: 0.2865 - regression_loss: 0.2706 - classification_loss: 0.0159 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2868 - regression_loss: 0.2709 - classification_loss: 0.0159 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2869 - regression_loss: 0.2710 - classification_loss: 0.0159 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2872 - regression_loss: 0.2713 - classification_loss: 0.0160 377/500 [=====================>........] - ETA: 1:01 - loss: 0.2875 - regression_loss: 0.2715 - classification_loss: 0.0160 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2874 - regression_loss: 0.2714 - classification_loss: 0.0160 379/500 [=====================>........] - ETA: 1:00 - loss: 0.2872 - regression_loss: 0.2712 - classification_loss: 0.0160 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2871 - regression_loss: 0.2711 - classification_loss: 0.0161 381/500 [=====================>........] - ETA: 59s - loss: 0.2868 - regression_loss: 0.2707 - classification_loss: 0.0160  382/500 [=====================>........] - ETA: 59s - loss: 0.2868 - regression_loss: 0.2708 - classification_loss: 0.0160 383/500 [=====================>........] - ETA: 58s - loss: 0.2866 - regression_loss: 0.2705 - classification_loss: 0.0160 384/500 [======================>.......] - ETA: 58s - loss: 0.2866 - regression_loss: 0.2705 - classification_loss: 0.0160 385/500 [======================>.......] - ETA: 57s - loss: 0.2870 - regression_loss: 0.2710 - classification_loss: 0.0160 386/500 [======================>.......] - ETA: 57s - loss: 0.2872 - regression_loss: 0.2711 - classification_loss: 0.0161 387/500 [======================>.......] - ETA: 56s - loss: 0.2868 - regression_loss: 0.2708 - classification_loss: 0.0161 388/500 [======================>.......] 
- ETA: 56s - loss: 0.2872 - regression_loss: 0.2711 - classification_loss: 0.0161 389/500 [======================>.......] - ETA: 55s - loss: 0.2869 - regression_loss: 0.2708 - classification_loss: 0.0161 390/500 [======================>.......] - ETA: 55s - loss: 0.2868 - regression_loss: 0.2707 - classification_loss: 0.0161 391/500 [======================>.......] - ETA: 54s - loss: 0.2870 - regression_loss: 0.2709 - classification_loss: 0.0161 392/500 [======================>.......] - ETA: 54s - loss: 0.2873 - regression_loss: 0.2711 - classification_loss: 0.0162 393/500 [======================>.......] - ETA: 53s - loss: 0.2874 - regression_loss: 0.2712 - classification_loss: 0.0162 394/500 [======================>.......] - ETA: 53s - loss: 0.2872 - regression_loss: 0.2710 - classification_loss: 0.0162 395/500 [======================>.......] - ETA: 52s - loss: 0.2874 - regression_loss: 0.2712 - classification_loss: 0.0162 396/500 [======================>.......] - ETA: 52s - loss: 0.2877 - regression_loss: 0.2715 - classification_loss: 0.0162 397/500 [======================>.......] - ETA: 51s - loss: 0.2875 - regression_loss: 0.2713 - classification_loss: 0.0162 398/500 [======================>.......] - ETA: 51s - loss: 0.2879 - regression_loss: 0.2716 - classification_loss: 0.0163 399/500 [======================>.......] - ETA: 50s - loss: 0.2877 - regression_loss: 0.2713 - classification_loss: 0.0163 400/500 [=======================>......] - ETA: 50s - loss: 0.2877 - regression_loss: 0.2714 - classification_loss: 0.0163 401/500 [=======================>......] - ETA: 49s - loss: 0.2871 - regression_loss: 0.2709 - classification_loss: 0.0163 402/500 [=======================>......] - ETA: 49s - loss: 0.2871 - regression_loss: 0.2708 - classification_loss: 0.0163 403/500 [=======================>......] - ETA: 48s - loss: 0.2868 - regression_loss: 0.2705 - classification_loss: 0.0162 404/500 [=======================>......] 
- ETA: 48s - loss: 0.2867 - regression_loss: 0.2704 - classification_loss: 0.0162 405/500 [=======================>......] - ETA: 47s - loss: 0.2863 - regression_loss: 0.2701 - classification_loss: 0.0162 406/500 [=======================>......] - ETA: 47s - loss: 0.2860 - regression_loss: 0.2698 - classification_loss: 0.0162 407/500 [=======================>......] - ETA: 46s - loss: 0.2860 - regression_loss: 0.2698 - classification_loss: 0.0162 408/500 [=======================>......] - ETA: 46s - loss: 0.2859 - regression_loss: 0.2697 - classification_loss: 0.0162 409/500 [=======================>......] - ETA: 45s - loss: 0.2858 - regression_loss: 0.2696 - classification_loss: 0.0162 410/500 [=======================>......] - ETA: 45s - loss: 0.2858 - regression_loss: 0.2696 - classification_loss: 0.0162 411/500 [=======================>......] - ETA: 44s - loss: 0.2859 - regression_loss: 0.2697 - classification_loss: 0.0162 412/500 [=======================>......] - ETA: 44s - loss: 0.2855 - regression_loss: 0.2693 - classification_loss: 0.0162 413/500 [=======================>......] - ETA: 43s - loss: 0.2859 - regression_loss: 0.2698 - classification_loss: 0.0162 414/500 [=======================>......] - ETA: 43s - loss: 0.2859 - regression_loss: 0.2697 - classification_loss: 0.0162 415/500 [=======================>......] - ETA: 42s - loss: 0.2864 - regression_loss: 0.2701 - classification_loss: 0.0162 416/500 [=======================>......] - ETA: 42s - loss: 0.2865 - regression_loss: 0.2703 - classification_loss: 0.0162 417/500 [========================>.....] - ETA: 41s - loss: 0.2868 - regression_loss: 0.2705 - classification_loss: 0.0162 418/500 [========================>.....] - ETA: 41s - loss: 0.2874 - regression_loss: 0.2711 - classification_loss: 0.0162 419/500 [========================>.....] - ETA: 40s - loss: 0.2873 - regression_loss: 0.2711 - classification_loss: 0.0162 420/500 [========================>.....] 
- ETA: 40s - loss: 0.2873 - regression_loss: 0.2710 - classification_loss: 0.0163 421/500 [========================>.....] - ETA: 39s - loss: 0.2876 - regression_loss: 0.2714 - classification_loss: 0.0163 422/500 [========================>.....] - ETA: 39s - loss: 0.2876 - regression_loss: 0.2713 - classification_loss: 0.0162 423/500 [========================>.....] - ETA: 38s - loss: 0.2874 - regression_loss: 0.2712 - classification_loss: 0.0162 424/500 [========================>.....] - ETA: 38s - loss: 0.2870 - regression_loss: 0.2708 - classification_loss: 0.0162 425/500 [========================>.....] - ETA: 37s - loss: 0.2878 - regression_loss: 0.2716 - classification_loss: 0.0162 426/500 [========================>.....] - ETA: 37s - loss: 0.2882 - regression_loss: 0.2721 - classification_loss: 0.0162 427/500 [========================>.....] - ETA: 36s - loss: 0.2882 - regression_loss: 0.2720 - classification_loss: 0.0162 428/500 [========================>.....] - ETA: 36s - loss: 0.2880 - regression_loss: 0.2719 - classification_loss: 0.0162 429/500 [========================>.....] - ETA: 35s - loss: 0.2880 - regression_loss: 0.2718 - classification_loss: 0.0162 430/500 [========================>.....] - ETA: 35s - loss: 0.2883 - regression_loss: 0.2721 - classification_loss: 0.0162 431/500 [========================>.....] - ETA: 34s - loss: 0.2882 - regression_loss: 0.2720 - classification_loss: 0.0162 432/500 [========================>.....] - ETA: 34s - loss: 0.2886 - regression_loss: 0.2724 - classification_loss: 0.0162 433/500 [========================>.....] - ETA: 33s - loss: 0.2884 - regression_loss: 0.2722 - classification_loss: 0.0162 434/500 [=========================>....] - ETA: 33s - loss: 0.2883 - regression_loss: 0.2721 - classification_loss: 0.0162 435/500 [=========================>....] - ETA: 32s - loss: 0.2882 - regression_loss: 0.2720 - classification_loss: 0.0162 436/500 [=========================>....] 
- ETA: 32s - loss: 0.2882 - regression_loss: 0.2720 - classification_loss: 0.0162 437/500 [=========================>....] - ETA: 31s - loss: 0.2882 - regression_loss: 0.2720 - classification_loss: 0.0162 438/500 [=========================>....] - ETA: 31s - loss: 0.2881 - regression_loss: 0.2719 - classification_loss: 0.0162 439/500 [=========================>....] - ETA: 30s - loss: 0.2877 - regression_loss: 0.2715 - classification_loss: 0.0161 440/500 [=========================>....] - ETA: 30s - loss: 0.2874 - regression_loss: 0.2713 - classification_loss: 0.0161 441/500 [=========================>....] - ETA: 29s - loss: 0.2877 - regression_loss: 0.2716 - classification_loss: 0.0161 442/500 [=========================>....] - ETA: 29s - loss: 0.2877 - regression_loss: 0.2715 - classification_loss: 0.0161 443/500 [=========================>....] - ETA: 28s - loss: 0.2879 - regression_loss: 0.2717 - classification_loss: 0.0162 444/500 [=========================>....] - ETA: 28s - loss: 0.2878 - regression_loss: 0.2716 - classification_loss: 0.0161 445/500 [=========================>....] - ETA: 27s - loss: 0.2877 - regression_loss: 0.2715 - classification_loss: 0.0162 446/500 [=========================>....] - ETA: 27s - loss: 0.2876 - regression_loss: 0.2715 - classification_loss: 0.0162 447/500 [=========================>....] - ETA: 26s - loss: 0.2877 - regression_loss: 0.2716 - classification_loss: 0.0161 448/500 [=========================>....] - ETA: 26s - loss: 0.2875 - regression_loss: 0.2714 - classification_loss: 0.0161 449/500 [=========================>....] - ETA: 25s - loss: 0.2874 - regression_loss: 0.2712 - classification_loss: 0.0161 450/500 [==========================>...] - ETA: 25s - loss: 0.2873 - regression_loss: 0.2712 - classification_loss: 0.0161 451/500 [==========================>...] - ETA: 24s - loss: 0.2874 - regression_loss: 0.2713 - classification_loss: 0.0161 452/500 [==========================>...] 
- ETA: 24s - loss: 0.2874 - regression_loss: 0.2713 - classification_loss: 0.0161 453/500 [==========================>...] - ETA: 23s - loss: 0.2872 - regression_loss: 0.2711 - classification_loss: 0.0161 454/500 [==========================>...] - ETA: 23s - loss: 0.2869 - regression_loss: 0.2708 - classification_loss: 0.0161 455/500 [==========================>...] - ETA: 22s - loss: 0.2867 - regression_loss: 0.2707 - classification_loss: 0.0160 456/500 [==========================>...] - ETA: 22s - loss: 0.2870 - regression_loss: 0.2710 - classification_loss: 0.0160 457/500 [==========================>...] - ETA: 21s - loss: 0.2869 - regression_loss: 0.2709 - classification_loss: 0.0160 458/500 [==========================>...] - ETA: 21s - loss: 0.2869 - regression_loss: 0.2709 - classification_loss: 0.0160 459/500 [==========================>...] - ETA: 20s - loss: 0.2869 - regression_loss: 0.2709 - classification_loss: 0.0160 460/500 [==========================>...] - ETA: 20s - loss: 0.2867 - regression_loss: 0.2707 - classification_loss: 0.0160 461/500 [==========================>...] - ETA: 19s - loss: 0.2867 - regression_loss: 0.2707 - classification_loss: 0.0160 462/500 [==========================>...] - ETA: 19s - loss: 0.2875 - regression_loss: 0.2714 - classification_loss: 0.0160 463/500 [==========================>...] - ETA: 18s - loss: 0.2873 - regression_loss: 0.2713 - classification_loss: 0.0160 464/500 [==========================>...] - ETA: 18s - loss: 0.2872 - regression_loss: 0.2713 - classification_loss: 0.0160 465/500 [==========================>...] - ETA: 17s - loss: 0.2874 - regression_loss: 0.2714 - classification_loss: 0.0160 466/500 [==========================>...] - ETA: 17s - loss: 0.2871 - regression_loss: 0.2711 - classification_loss: 0.0160 467/500 [===========================>..] - ETA: 16s - loss: 0.2869 - regression_loss: 0.2709 - classification_loss: 0.0160 468/500 [===========================>..] 
- ETA: 16s - loss: 0.2871 - regression_loss: 0.2711 - classification_loss: 0.0160 469/500 [===========================>..] - ETA: 15s - loss: 0.2869 - regression_loss: 0.2710 - classification_loss: 0.0160 470/500 [===========================>..] - ETA: 15s - loss: 0.2868 - regression_loss: 0.2708 - classification_loss: 0.0160 471/500 [===========================>..] - ETA: 14s - loss: 0.2873 - regression_loss: 0.2713 - classification_loss: 0.0160 472/500 [===========================>..] - ETA: 14s - loss: 0.2870 - regression_loss: 0.2710 - classification_loss: 0.0160 473/500 [===========================>..] - ETA: 13s - loss: 0.2871 - regression_loss: 0.2711 - classification_loss: 0.0160 474/500 [===========================>..] - ETA: 13s - loss: 0.2868 - regression_loss: 0.2709 - classification_loss: 0.0160 475/500 [===========================>..] - ETA: 12s - loss: 0.2868 - regression_loss: 0.2709 - classification_loss: 0.0159 476/500 [===========================>..] - ETA: 12s - loss: 0.2867 - regression_loss: 0.2708 - classification_loss: 0.0159 477/500 [===========================>..] - ETA: 11s - loss: 0.2864 - regression_loss: 0.2705 - classification_loss: 0.0159 478/500 [===========================>..] - ETA: 11s - loss: 0.2865 - regression_loss: 0.2706 - classification_loss: 0.0159 479/500 [===========================>..] - ETA: 10s - loss: 0.2872 - regression_loss: 0.2712 - classification_loss: 0.0160 480/500 [===========================>..] - ETA: 10s - loss: 0.2872 - regression_loss: 0.2712 - classification_loss: 0.0160 481/500 [===========================>..] - ETA: 9s - loss: 0.2872 - regression_loss: 0.2712 - classification_loss: 0.0160  482/500 [===========================>..] - ETA: 9s - loss: 0.2874 - regression_loss: 0.2714 - classification_loss: 0.0160 483/500 [===========================>..] - ETA: 8s - loss: 0.2875 - regression_loss: 0.2715 - classification_loss: 0.0159 484/500 [============================>.] 
- ETA: 8s - loss: 0.2874 - regression_loss: 0.2715 - classification_loss: 0.0159 485/500 [============================>.] - ETA: 7s - loss: 0.2873 - regression_loss: 0.2714 - classification_loss: 0.0159 486/500 [============================>.] - ETA: 7s - loss: 0.2874 - regression_loss: 0.2715 - classification_loss: 0.0159 487/500 [============================>.] - ETA: 6s - loss: 0.2875 - regression_loss: 0.2716 - classification_loss: 0.0159 488/500 [============================>.] - ETA: 6s - loss: 0.2874 - regression_loss: 0.2714 - classification_loss: 0.0159 489/500 [============================>.] - ETA: 5s - loss: 0.2876 - regression_loss: 0.2716 - classification_loss: 0.0160 490/500 [============================>.] - ETA: 5s - loss: 0.2877 - regression_loss: 0.2717 - classification_loss: 0.0160 491/500 [============================>.] - ETA: 4s - loss: 0.2877 - regression_loss: 0.2717 - classification_loss: 0.0160 492/500 [============================>.] - ETA: 4s - loss: 0.2876 - regression_loss: 0.2716 - classification_loss: 0.0160 493/500 [============================>.] - ETA: 3s - loss: 0.2880 - regression_loss: 0.2719 - classification_loss: 0.0161 494/500 [============================>.] - ETA: 3s - loss: 0.2879 - regression_loss: 0.2718 - classification_loss: 0.0160 495/500 [============================>.] - ETA: 2s - loss: 0.2877 - regression_loss: 0.2717 - classification_loss: 0.0160 496/500 [============================>.] - ETA: 2s - loss: 0.2873 - regression_loss: 0.2713 - classification_loss: 0.0160 497/500 [============================>.] - ETA: 1s - loss: 0.2872 - regression_loss: 0.2712 - classification_loss: 0.0160 498/500 [============================>.] - ETA: 1s - loss: 0.2872 - regression_loss: 0.2712 - classification_loss: 0.0160 499/500 [============================>.] 
500/500 [==============================] - 251s 502ms/step - loss: 0.2871 - regression_loss: 0.2711 - classification_loss: 0.0160
1172 instances of class plum with average precision: 0.7462
mAP: 0.7462
Epoch 00021: saving model to ./training/snapshots/resnet101_pascal_21.h5
Epoch 22/150
[per-batch progress updates for epoch 22 (steps 1/500 through 93/500) trimmed]
- ETA: 3:23 - loss: 0.2938 - regression_loss: 0.2772 - classification_loss: 0.0166 95/500 [====>.........................] - ETA: 3:22 - loss: 0.2932 - regression_loss: 0.2766 - classification_loss: 0.0166 96/500 [====>.........................] - ETA: 3:21 - loss: 0.2919 - regression_loss: 0.2754 - classification_loss: 0.0165 97/500 [====>.........................] - ETA: 3:21 - loss: 0.2909 - regression_loss: 0.2745 - classification_loss: 0.0164 98/500 [====>.........................] - ETA: 3:21 - loss: 0.2901 - regression_loss: 0.2738 - classification_loss: 0.0163 99/500 [====>.........................] - ETA: 3:20 - loss: 0.2877 - regression_loss: 0.2715 - classification_loss: 0.0162 100/500 [=====>........................] - ETA: 3:20 - loss: 0.2880 - regression_loss: 0.2718 - classification_loss: 0.0162 101/500 [=====>........................] - ETA: 3:19 - loss: 0.2867 - regression_loss: 0.2706 - classification_loss: 0.0161 102/500 [=====>........................] - ETA: 3:19 - loss: 0.2874 - regression_loss: 0.2713 - classification_loss: 0.0161 103/500 [=====>........................] - ETA: 3:18 - loss: 0.2871 - regression_loss: 0.2711 - classification_loss: 0.0160 104/500 [=====>........................] - ETA: 3:18 - loss: 0.2895 - regression_loss: 0.2731 - classification_loss: 0.0164 105/500 [=====>........................] - ETA: 3:17 - loss: 0.2924 - regression_loss: 0.2760 - classification_loss: 0.0165 106/500 [=====>........................] - ETA: 3:17 - loss: 0.2942 - regression_loss: 0.2777 - classification_loss: 0.0165 107/500 [=====>........................] - ETA: 3:16 - loss: 0.2947 - regression_loss: 0.2782 - classification_loss: 0.0165 108/500 [=====>........................] - ETA: 3:16 - loss: 0.2950 - regression_loss: 0.2785 - classification_loss: 0.0164 109/500 [=====>........................] - ETA: 3:15 - loss: 0.2945 - regression_loss: 0.2782 - classification_loss: 0.0163 110/500 [=====>........................] 
- ETA: 3:15 - loss: 0.2953 - regression_loss: 0.2791 - classification_loss: 0.0163 111/500 [=====>........................] - ETA: 3:14 - loss: 0.2944 - regression_loss: 0.2783 - classification_loss: 0.0162 112/500 [=====>........................] - ETA: 3:14 - loss: 0.2937 - regression_loss: 0.2776 - classification_loss: 0.0161 113/500 [=====>........................] - ETA: 3:13 - loss: 0.2938 - regression_loss: 0.2778 - classification_loss: 0.0161 114/500 [=====>........................] - ETA: 3:13 - loss: 0.2931 - regression_loss: 0.2771 - classification_loss: 0.0161 115/500 [=====>........................] - ETA: 3:12 - loss: 0.2930 - regression_loss: 0.2769 - classification_loss: 0.0161 116/500 [=====>........................] - ETA: 3:12 - loss: 0.2929 - regression_loss: 0.2768 - classification_loss: 0.0162 117/500 [======>.......................] - ETA: 3:11 - loss: 0.2919 - regression_loss: 0.2758 - classification_loss: 0.0161 118/500 [======>.......................] - ETA: 3:11 - loss: 0.2908 - regression_loss: 0.2748 - classification_loss: 0.0160 119/500 [======>.......................] - ETA: 3:10 - loss: 0.2902 - regression_loss: 0.2743 - classification_loss: 0.0159 120/500 [======>.......................] - ETA: 3:10 - loss: 0.2897 - regression_loss: 0.2737 - classification_loss: 0.0159 121/500 [======>.......................] - ETA: 3:09 - loss: 0.2878 - regression_loss: 0.2720 - classification_loss: 0.0158 122/500 [======>.......................] - ETA: 3:09 - loss: 0.2901 - regression_loss: 0.2741 - classification_loss: 0.0160 123/500 [======>.......................] - ETA: 3:08 - loss: 0.2889 - regression_loss: 0.2730 - classification_loss: 0.0159 124/500 [======>.......................] - ETA: 3:08 - loss: 0.2890 - regression_loss: 0.2731 - classification_loss: 0.0159 125/500 [======>.......................] - ETA: 3:07 - loss: 0.2878 - regression_loss: 0.2719 - classification_loss: 0.0158 126/500 [======>.......................] 
- ETA: 3:07 - loss: 0.2872 - regression_loss: 0.2715 - classification_loss: 0.0158 127/500 [======>.......................] - ETA: 3:06 - loss: 0.2874 - regression_loss: 0.2716 - classification_loss: 0.0158 128/500 [======>.......................] - ETA: 3:06 - loss: 0.2866 - regression_loss: 0.2708 - classification_loss: 0.0158 129/500 [======>.......................] - ETA: 3:05 - loss: 0.2860 - regression_loss: 0.2702 - classification_loss: 0.0158 130/500 [======>.......................] - ETA: 3:05 - loss: 0.2856 - regression_loss: 0.2699 - classification_loss: 0.0157 131/500 [======>.......................] - ETA: 3:04 - loss: 0.2860 - regression_loss: 0.2702 - classification_loss: 0.0158 132/500 [======>.......................] - ETA: 3:04 - loss: 0.2853 - regression_loss: 0.2696 - classification_loss: 0.0157 133/500 [======>.......................] - ETA: 3:04 - loss: 0.2847 - regression_loss: 0.2690 - classification_loss: 0.0157 134/500 [=======>......................] - ETA: 3:03 - loss: 0.2846 - regression_loss: 0.2687 - classification_loss: 0.0158 135/500 [=======>......................] - ETA: 3:03 - loss: 0.2838 - regression_loss: 0.2680 - classification_loss: 0.0158 136/500 [=======>......................] - ETA: 3:02 - loss: 0.2840 - regression_loss: 0.2682 - classification_loss: 0.0157 137/500 [=======>......................] - ETA: 3:02 - loss: 0.2842 - regression_loss: 0.2685 - classification_loss: 0.0157 138/500 [=======>......................] - ETA: 3:01 - loss: 0.2841 - regression_loss: 0.2684 - classification_loss: 0.0157 139/500 [=======>......................] - ETA: 3:01 - loss: 0.2835 - regression_loss: 0.2678 - classification_loss: 0.0156 140/500 [=======>......................] - ETA: 3:00 - loss: 0.2827 - regression_loss: 0.2671 - classification_loss: 0.0156 141/500 [=======>......................] - ETA: 3:00 - loss: 0.2812 - regression_loss: 0.2657 - classification_loss: 0.0155 142/500 [=======>......................] 
- ETA: 2:59 - loss: 0.2818 - regression_loss: 0.2663 - classification_loss: 0.0156 143/500 [=======>......................] - ETA: 2:59 - loss: 0.2818 - regression_loss: 0.2662 - classification_loss: 0.0156 144/500 [=======>......................] - ETA: 2:58 - loss: 0.2828 - regression_loss: 0.2671 - classification_loss: 0.0157 145/500 [=======>......................] - ETA: 2:58 - loss: 0.2822 - regression_loss: 0.2665 - classification_loss: 0.0157 146/500 [=======>......................] - ETA: 2:57 - loss: 0.2831 - regression_loss: 0.2674 - classification_loss: 0.0158 147/500 [=======>......................] - ETA: 2:57 - loss: 0.2825 - regression_loss: 0.2668 - classification_loss: 0.0157 148/500 [=======>......................] - ETA: 2:56 - loss: 0.2840 - regression_loss: 0.2681 - classification_loss: 0.0159 149/500 [=======>......................] - ETA: 2:56 - loss: 0.2847 - regression_loss: 0.2688 - classification_loss: 0.0159 150/500 [========>.....................] - ETA: 2:55 - loss: 0.2842 - regression_loss: 0.2683 - classification_loss: 0.0159 151/500 [========>.....................] - ETA: 2:55 - loss: 0.2844 - regression_loss: 0.2685 - classification_loss: 0.0159 152/500 [========>.....................] - ETA: 2:54 - loss: 0.2843 - regression_loss: 0.2685 - classification_loss: 0.0159 153/500 [========>.....................] - ETA: 2:54 - loss: 0.2852 - regression_loss: 0.2692 - classification_loss: 0.0160 154/500 [========>.....................] - ETA: 2:53 - loss: 0.2841 - regression_loss: 0.2682 - classification_loss: 0.0159 155/500 [========>.....................] - ETA: 2:53 - loss: 0.2837 - regression_loss: 0.2678 - classification_loss: 0.0159 156/500 [========>.....................] - ETA: 2:52 - loss: 0.2838 - regression_loss: 0.2679 - classification_loss: 0.0159 157/500 [========>.....................] - ETA: 2:52 - loss: 0.2840 - regression_loss: 0.2682 - classification_loss: 0.0159 158/500 [========>.....................] 
- ETA: 2:51 - loss: 0.2847 - regression_loss: 0.2688 - classification_loss: 0.0159 159/500 [========>.....................] - ETA: 2:51 - loss: 0.2860 - regression_loss: 0.2699 - classification_loss: 0.0160 160/500 [========>.....................] - ETA: 2:50 - loss: 0.2851 - regression_loss: 0.2692 - classification_loss: 0.0159 161/500 [========>.....................] - ETA: 2:50 - loss: 0.2876 - regression_loss: 0.2716 - classification_loss: 0.0160 162/500 [========>.....................] - ETA: 2:49 - loss: 0.2880 - regression_loss: 0.2720 - classification_loss: 0.0160 163/500 [========>.....................] - ETA: 2:49 - loss: 0.2876 - regression_loss: 0.2716 - classification_loss: 0.0160 164/500 [========>.....................] - ETA: 2:48 - loss: 0.2876 - regression_loss: 0.2716 - classification_loss: 0.0161 165/500 [========>.....................] - ETA: 2:48 - loss: 0.2865 - regression_loss: 0.2704 - classification_loss: 0.0160 166/500 [========>.....................] - ETA: 2:47 - loss: 0.2856 - regression_loss: 0.2696 - classification_loss: 0.0160 167/500 [=========>....................] - ETA: 2:47 - loss: 0.2851 - regression_loss: 0.2692 - classification_loss: 0.0159 168/500 [=========>....................] - ETA: 2:46 - loss: 0.2850 - regression_loss: 0.2691 - classification_loss: 0.0159 169/500 [=========>....................] - ETA: 2:46 - loss: 0.2841 - regression_loss: 0.2682 - classification_loss: 0.0159 170/500 [=========>....................] - ETA: 2:45 - loss: 0.2834 - regression_loss: 0.2676 - classification_loss: 0.0158 171/500 [=========>....................] - ETA: 2:45 - loss: 0.2828 - regression_loss: 0.2671 - classification_loss: 0.0158 172/500 [=========>....................] - ETA: 2:44 - loss: 0.2817 - regression_loss: 0.2660 - classification_loss: 0.0157 173/500 [=========>....................] - ETA: 2:44 - loss: 0.2819 - regression_loss: 0.2662 - classification_loss: 0.0157 174/500 [=========>....................] 
- ETA: 2:43 - loss: 0.2814 - regression_loss: 0.2657 - classification_loss: 0.0157 175/500 [=========>....................] - ETA: 2:43 - loss: 0.2807 - regression_loss: 0.2651 - classification_loss: 0.0156 176/500 [=========>....................] - ETA: 2:42 - loss: 0.2794 - regression_loss: 0.2639 - classification_loss: 0.0155 177/500 [=========>....................] - ETA: 2:42 - loss: 0.2797 - regression_loss: 0.2642 - classification_loss: 0.0155 178/500 [=========>....................] - ETA: 2:41 - loss: 0.2792 - regression_loss: 0.2637 - classification_loss: 0.0155 179/500 [=========>....................] - ETA: 2:41 - loss: 0.2782 - regression_loss: 0.2628 - classification_loss: 0.0154 180/500 [=========>....................] - ETA: 2:40 - loss: 0.2775 - regression_loss: 0.2621 - classification_loss: 0.0154 181/500 [=========>....................] - ETA: 2:40 - loss: 0.2776 - regression_loss: 0.2621 - classification_loss: 0.0155 182/500 [=========>....................] - ETA: 2:39 - loss: 0.2775 - regression_loss: 0.2620 - classification_loss: 0.0155 183/500 [=========>....................] - ETA: 2:39 - loss: 0.2764 - regression_loss: 0.2610 - classification_loss: 0.0154 184/500 [==========>...................] - ETA: 2:38 - loss: 0.2769 - regression_loss: 0.2614 - classification_loss: 0.0155 185/500 [==========>...................] - ETA: 2:38 - loss: 0.2781 - regression_loss: 0.2625 - classification_loss: 0.0156 186/500 [==========>...................] - ETA: 2:37 - loss: 0.2786 - regression_loss: 0.2630 - classification_loss: 0.0156 187/500 [==========>...................] - ETA: 2:37 - loss: 0.2787 - regression_loss: 0.2630 - classification_loss: 0.0156 188/500 [==========>...................] - ETA: 2:36 - loss: 0.2787 - regression_loss: 0.2630 - classification_loss: 0.0157 189/500 [==========>...................] - ETA: 2:36 - loss: 0.2789 - regression_loss: 0.2632 - classification_loss: 0.0157 190/500 [==========>...................] 
- ETA: 2:35 - loss: 0.2808 - regression_loss: 0.2649 - classification_loss: 0.0159 191/500 [==========>...................] - ETA: 2:35 - loss: 0.2803 - regression_loss: 0.2645 - classification_loss: 0.0158 192/500 [==========>...................] - ETA: 2:34 - loss: 0.2807 - regression_loss: 0.2649 - classification_loss: 0.0158 193/500 [==========>...................] - ETA: 2:34 - loss: 0.2807 - regression_loss: 0.2649 - classification_loss: 0.0158 194/500 [==========>...................] - ETA: 2:33 - loss: 0.2811 - regression_loss: 0.2652 - classification_loss: 0.0159 195/500 [==========>...................] - ETA: 2:33 - loss: 0.2814 - regression_loss: 0.2655 - classification_loss: 0.0159 196/500 [==========>...................] - ETA: 2:33 - loss: 0.2827 - regression_loss: 0.2668 - classification_loss: 0.0159 197/500 [==========>...................] - ETA: 2:32 - loss: 0.2828 - regression_loss: 0.2668 - classification_loss: 0.0160 198/500 [==========>...................] - ETA: 2:32 - loss: 0.2829 - regression_loss: 0.2669 - classification_loss: 0.0160 199/500 [==========>...................] - ETA: 2:31 - loss: 0.2830 - regression_loss: 0.2671 - classification_loss: 0.0159 200/500 [===========>..................] - ETA: 2:30 - loss: 0.2824 - regression_loss: 0.2665 - classification_loss: 0.0159 201/500 [===========>..................] - ETA: 2:30 - loss: 0.2828 - regression_loss: 0.2669 - classification_loss: 0.0159 202/500 [===========>..................] - ETA: 2:29 - loss: 0.2832 - regression_loss: 0.2672 - classification_loss: 0.0160 203/500 [===========>..................] - ETA: 2:29 - loss: 0.2831 - regression_loss: 0.2671 - classification_loss: 0.0159 204/500 [===========>..................] - ETA: 2:28 - loss: 0.2833 - regression_loss: 0.2673 - classification_loss: 0.0160 205/500 [===========>..................] - ETA: 2:28 - loss: 0.2822 - regression_loss: 0.2663 - classification_loss: 0.0160 206/500 [===========>..................] 
- ETA: 2:27 - loss: 0.2829 - regression_loss: 0.2670 - classification_loss: 0.0160 207/500 [===========>..................] - ETA: 2:27 - loss: 0.2823 - regression_loss: 0.2664 - classification_loss: 0.0159 208/500 [===========>..................] - ETA: 2:26 - loss: 0.2819 - regression_loss: 0.2660 - classification_loss: 0.0159 209/500 [===========>..................] - ETA: 2:26 - loss: 0.2831 - regression_loss: 0.2672 - classification_loss: 0.0159 210/500 [===========>..................] - ETA: 2:25 - loss: 0.2841 - regression_loss: 0.2681 - classification_loss: 0.0160 211/500 [===========>..................] - ETA: 2:25 - loss: 0.2834 - regression_loss: 0.2675 - classification_loss: 0.0159 212/500 [===========>..................] - ETA: 2:24 - loss: 0.2834 - regression_loss: 0.2675 - classification_loss: 0.0159 213/500 [===========>..................] - ETA: 2:24 - loss: 0.2824 - regression_loss: 0.2666 - classification_loss: 0.0159 214/500 [===========>..................] - ETA: 2:23 - loss: 0.2823 - regression_loss: 0.2664 - classification_loss: 0.0158 215/500 [===========>..................] - ETA: 2:23 - loss: 0.2817 - regression_loss: 0.2659 - classification_loss: 0.0158 216/500 [===========>..................] - ETA: 2:22 - loss: 0.2812 - regression_loss: 0.2655 - classification_loss: 0.0158 217/500 [============>.................] - ETA: 2:22 - loss: 0.2808 - regression_loss: 0.2651 - classification_loss: 0.0157 218/500 [============>.................] - ETA: 2:21 - loss: 0.2802 - regression_loss: 0.2645 - classification_loss: 0.0157 219/500 [============>.................] - ETA: 2:21 - loss: 0.2802 - regression_loss: 0.2645 - classification_loss: 0.0157 220/500 [============>.................] - ETA: 2:20 - loss: 0.2799 - regression_loss: 0.2643 - classification_loss: 0.0156 221/500 [============>.................] - ETA: 2:20 - loss: 0.2803 - regression_loss: 0.2646 - classification_loss: 0.0158 222/500 [============>.................] 
- ETA: 2:19 - loss: 0.2797 - regression_loss: 0.2639 - classification_loss: 0.0157 223/500 [============>.................] - ETA: 2:19 - loss: 0.2806 - regression_loss: 0.2649 - classification_loss: 0.0157 224/500 [============>.................] - ETA: 2:18 - loss: 0.2807 - regression_loss: 0.2650 - classification_loss: 0.0157 225/500 [============>.................] - ETA: 2:18 - loss: 0.2801 - regression_loss: 0.2644 - classification_loss: 0.0157 226/500 [============>.................] - ETA: 2:17 - loss: 0.2799 - regression_loss: 0.2642 - classification_loss: 0.0157 227/500 [============>.................] - ETA: 2:17 - loss: 0.2797 - regression_loss: 0.2639 - classification_loss: 0.0157 228/500 [============>.................] - ETA: 2:16 - loss: 0.2791 - regression_loss: 0.2634 - classification_loss: 0.0157 229/500 [============>.................] - ETA: 2:16 - loss: 0.2787 - regression_loss: 0.2631 - classification_loss: 0.0156 230/500 [============>.................] - ETA: 2:15 - loss: 0.2785 - regression_loss: 0.2628 - classification_loss: 0.0156 231/500 [============>.................] - ETA: 2:15 - loss: 0.2779 - regression_loss: 0.2623 - classification_loss: 0.0156 232/500 [============>.................] - ETA: 2:14 - loss: 0.2776 - regression_loss: 0.2620 - classification_loss: 0.0156 233/500 [============>.................] - ETA: 2:14 - loss: 0.2775 - regression_loss: 0.2619 - classification_loss: 0.0156 234/500 [=============>................] - ETA: 2:13 - loss: 0.2769 - regression_loss: 0.2614 - classification_loss: 0.0155 235/500 [=============>................] - ETA: 2:13 - loss: 0.2771 - regression_loss: 0.2616 - classification_loss: 0.0155 236/500 [=============>................] - ETA: 2:12 - loss: 0.2769 - regression_loss: 0.2614 - classification_loss: 0.0155 237/500 [=============>................] - ETA: 2:12 - loss: 0.2764 - regression_loss: 0.2610 - classification_loss: 0.0154 238/500 [=============>................] 
- ETA: 2:11 - loss: 0.2759 - regression_loss: 0.2605 - classification_loss: 0.0154 239/500 [=============>................] - ETA: 2:11 - loss: 0.2760 - regression_loss: 0.2606 - classification_loss: 0.0154 240/500 [=============>................] - ETA: 2:10 - loss: 0.2754 - regression_loss: 0.2601 - classification_loss: 0.0153 241/500 [=============>................] - ETA: 2:10 - loss: 0.2751 - regression_loss: 0.2598 - classification_loss: 0.0153 242/500 [=============>................] - ETA: 2:09 - loss: 0.2753 - regression_loss: 0.2600 - classification_loss: 0.0153 243/500 [=============>................] - ETA: 2:09 - loss: 0.2754 - regression_loss: 0.2601 - classification_loss: 0.0154 244/500 [=============>................] - ETA: 2:08 - loss: 0.2761 - regression_loss: 0.2607 - classification_loss: 0.0154 245/500 [=============>................] - ETA: 2:08 - loss: 0.2753 - regression_loss: 0.2600 - classification_loss: 0.0154 246/500 [=============>................] - ETA: 2:07 - loss: 0.2759 - regression_loss: 0.2605 - classification_loss: 0.0154 247/500 [=============>................] - ETA: 2:07 - loss: 0.2774 - regression_loss: 0.2619 - classification_loss: 0.0155 248/500 [=============>................] - ETA: 2:06 - loss: 0.2775 - regression_loss: 0.2620 - classification_loss: 0.0155 249/500 [=============>................] - ETA: 2:06 - loss: 0.2778 - regression_loss: 0.2623 - classification_loss: 0.0155 250/500 [==============>...............] - ETA: 2:05 - loss: 0.2781 - regression_loss: 0.2625 - classification_loss: 0.0156 251/500 [==============>...............] - ETA: 2:05 - loss: 0.2779 - regression_loss: 0.2623 - classification_loss: 0.0156 252/500 [==============>...............] - ETA: 2:04 - loss: 0.2780 - regression_loss: 0.2625 - classification_loss: 0.0156 253/500 [==============>...............] - ETA: 2:04 - loss: 0.2779 - regression_loss: 0.2623 - classification_loss: 0.0155 254/500 [==============>...............] 
- ETA: 2:03 - loss: 0.2790 - regression_loss: 0.2634 - classification_loss: 0.0155 255/500 [==============>...............] - ETA: 2:03 - loss: 0.2789 - regression_loss: 0.2634 - classification_loss: 0.0156 256/500 [==============>...............] - ETA: 2:02 - loss: 0.2790 - regression_loss: 0.2634 - classification_loss: 0.0155 257/500 [==============>...............] - ETA: 2:02 - loss: 0.2794 - regression_loss: 0.2638 - classification_loss: 0.0156 258/500 [==============>...............] - ETA: 2:01 - loss: 0.2797 - regression_loss: 0.2641 - classification_loss: 0.0156 259/500 [==============>...............] - ETA: 2:01 - loss: 0.2796 - regression_loss: 0.2640 - classification_loss: 0.0156 260/500 [==============>...............] - ETA: 2:00 - loss: 0.2795 - regression_loss: 0.2639 - classification_loss: 0.0156 261/500 [==============>...............] - ETA: 2:00 - loss: 0.2796 - regression_loss: 0.2640 - classification_loss: 0.0156 262/500 [==============>...............] - ETA: 1:59 - loss: 0.2790 - regression_loss: 0.2635 - classification_loss: 0.0155 263/500 [==============>...............] - ETA: 1:59 - loss: 0.2793 - regression_loss: 0.2637 - classification_loss: 0.0156 264/500 [==============>...............] - ETA: 1:58 - loss: 0.2794 - regression_loss: 0.2638 - classification_loss: 0.0156 265/500 [==============>...............] - ETA: 1:58 - loss: 0.2801 - regression_loss: 0.2644 - classification_loss: 0.0157 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2801 - regression_loss: 0.2644 - classification_loss: 0.0157 267/500 [===============>..............] - ETA: 1:57 - loss: 0.2801 - regression_loss: 0.2644 - classification_loss: 0.0157 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2800 - regression_loss: 0.2643 - classification_loss: 0.0157 269/500 [===============>..............] - ETA: 1:56 - loss: 0.2815 - regression_loss: 0.2658 - classification_loss: 0.0157 270/500 [===============>..............] 
- ETA: 1:55 - loss: 0.2813 - regression_loss: 0.2657 - classification_loss: 0.0157 271/500 [===============>..............] - ETA: 1:55 - loss: 0.2808 - regression_loss: 0.2651 - classification_loss: 0.0156 272/500 [===============>..............] - ETA: 1:54 - loss: 0.2808 - regression_loss: 0.2652 - classification_loss: 0.0156 273/500 [===============>..............] - ETA: 1:54 - loss: 0.2808 - regression_loss: 0.2652 - classification_loss: 0.0156 274/500 [===============>..............] - ETA: 1:53 - loss: 0.2808 - regression_loss: 0.2652 - classification_loss: 0.0156 275/500 [===============>..............] - ETA: 1:53 - loss: 0.2808 - regression_loss: 0.2652 - classification_loss: 0.0156 276/500 [===============>..............] - ETA: 1:52 - loss: 0.2807 - regression_loss: 0.2651 - classification_loss: 0.0156 277/500 [===============>..............] - ETA: 1:52 - loss: 0.2812 - regression_loss: 0.2657 - classification_loss: 0.0156 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2813 - regression_loss: 0.2657 - classification_loss: 0.0155 279/500 [===============>..............] - ETA: 1:51 - loss: 0.2817 - regression_loss: 0.2662 - classification_loss: 0.0156 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2817 - regression_loss: 0.2661 - classification_loss: 0.0155 281/500 [===============>..............] - ETA: 1:50 - loss: 0.2813 - regression_loss: 0.2657 - classification_loss: 0.0155 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2818 - regression_loss: 0.2663 - classification_loss: 0.0155 283/500 [===============>..............] - ETA: 1:49 - loss: 0.2813 - regression_loss: 0.2658 - classification_loss: 0.0155 284/500 [================>.............] - ETA: 1:48 - loss: 0.2813 - regression_loss: 0.2659 - classification_loss: 0.0155 285/500 [================>.............] - ETA: 1:48 - loss: 0.2818 - regression_loss: 0.2663 - classification_loss: 0.0155 286/500 [================>.............] 
- ETA: 1:47 - loss: 0.2816 - regression_loss: 0.2661 - classification_loss: 0.0155 287/500 [================>.............] - ETA: 1:47 - loss: 0.2812 - regression_loss: 0.2657 - classification_loss: 0.0155 288/500 [================>.............] - ETA: 1:46 - loss: 0.2811 - regression_loss: 0.2656 - classification_loss: 0.0155 289/500 [================>.............] - ETA: 1:46 - loss: 0.2814 - regression_loss: 0.2659 - classification_loss: 0.0155 290/500 [================>.............] - ETA: 1:45 - loss: 0.2809 - regression_loss: 0.2655 - classification_loss: 0.0154 291/500 [================>.............] - ETA: 1:45 - loss: 0.2807 - regression_loss: 0.2653 - classification_loss: 0.0154 292/500 [================>.............] - ETA: 1:44 - loss: 0.2804 - regression_loss: 0.2650 - classification_loss: 0.0154 293/500 [================>.............] - ETA: 1:44 - loss: 0.2802 - regression_loss: 0.2648 - classification_loss: 0.0154 294/500 [================>.............] - ETA: 1:43 - loss: 0.2802 - regression_loss: 0.2648 - classification_loss: 0.0154 295/500 [================>.............] - ETA: 1:43 - loss: 0.2797 - regression_loss: 0.2644 - classification_loss: 0.0153 296/500 [================>.............] - ETA: 1:42 - loss: 0.2795 - regression_loss: 0.2642 - classification_loss: 0.0153 297/500 [================>.............] - ETA: 1:42 - loss: 0.2791 - regression_loss: 0.2638 - classification_loss: 0.0153 298/500 [================>.............] - ETA: 1:41 - loss: 0.2788 - regression_loss: 0.2635 - classification_loss: 0.0153 299/500 [================>.............] - ETA: 1:41 - loss: 0.2783 - regression_loss: 0.2630 - classification_loss: 0.0152 300/500 [=================>............] - ETA: 1:40 - loss: 0.2785 - regression_loss: 0.2632 - classification_loss: 0.0153 301/500 [=================>............] - ETA: 1:40 - loss: 0.2786 - regression_loss: 0.2633 - classification_loss: 0.0153 302/500 [=================>............] 
500/500 [==============================] - 253s 505ms/step - loss: 0.2772 - regression_loss: 0.2620 - classification_loss: 0.0152
1172 instances of class plum with average precision: 0.7553
mAP: 0.7553
Epoch 00022: saving model to ./training/snapshots/resnet101_pascal_22.h5
Epoch 23/150
- ETA: 3:03 - loss: 0.2665 - regression_loss: 0.2527 - classification_loss: 0.0138 138/500 [=======>......................] - ETA: 3:03 - loss: 0.2663 - regression_loss: 0.2525 - classification_loss: 0.0138 139/500 [=======>......................] - ETA: 3:02 - loss: 0.2679 - regression_loss: 0.2540 - classification_loss: 0.0138 140/500 [=======>......................] - ETA: 3:02 - loss: 0.2674 - regression_loss: 0.2536 - classification_loss: 0.0138 141/500 [=======>......................] - ETA: 3:01 - loss: 0.2664 - regression_loss: 0.2527 - classification_loss: 0.0137 142/500 [=======>......................] - ETA: 3:01 - loss: 0.2659 - regression_loss: 0.2521 - classification_loss: 0.0138 143/500 [=======>......................] - ETA: 3:00 - loss: 0.2657 - regression_loss: 0.2520 - classification_loss: 0.0138 144/500 [=======>......................] - ETA: 3:00 - loss: 0.2657 - regression_loss: 0.2518 - classification_loss: 0.0139 145/500 [=======>......................] - ETA: 2:59 - loss: 0.2655 - regression_loss: 0.2516 - classification_loss: 0.0139 146/500 [=======>......................] - ETA: 2:59 - loss: 0.2696 - regression_loss: 0.2557 - classification_loss: 0.0138 147/500 [=======>......................] - ETA: 2:58 - loss: 0.2687 - regression_loss: 0.2549 - classification_loss: 0.0138 148/500 [=======>......................] - ETA: 2:58 - loss: 0.2695 - regression_loss: 0.2557 - classification_loss: 0.0138 149/500 [=======>......................] - ETA: 2:57 - loss: 0.2699 - regression_loss: 0.2561 - classification_loss: 0.0138 150/500 [========>.....................] - ETA: 2:57 - loss: 0.2695 - regression_loss: 0.2558 - classification_loss: 0.0138 151/500 [========>.....................] - ETA: 2:56 - loss: 0.2687 - regression_loss: 0.2550 - classification_loss: 0.0137 152/500 [========>.....................] - ETA: 2:56 - loss: 0.2685 - regression_loss: 0.2548 - classification_loss: 0.0137 153/500 [========>.....................] 
- ETA: 2:55 - loss: 0.2677 - regression_loss: 0.2541 - classification_loss: 0.0137 154/500 [========>.....................] - ETA: 2:55 - loss: 0.2671 - regression_loss: 0.2535 - classification_loss: 0.0136 155/500 [========>.....................] - ETA: 2:54 - loss: 0.2671 - regression_loss: 0.2535 - classification_loss: 0.0136 156/500 [========>.....................] - ETA: 2:54 - loss: 0.2673 - regression_loss: 0.2536 - classification_loss: 0.0137 157/500 [========>.....................] - ETA: 2:53 - loss: 0.2671 - regression_loss: 0.2535 - classification_loss: 0.0136 158/500 [========>.....................] - ETA: 2:53 - loss: 0.2668 - regression_loss: 0.2532 - classification_loss: 0.0136 159/500 [========>.....................] - ETA: 2:52 - loss: 0.2662 - regression_loss: 0.2526 - classification_loss: 0.0136 160/500 [========>.....................] - ETA: 2:52 - loss: 0.2661 - regression_loss: 0.2525 - classification_loss: 0.0136 161/500 [========>.....................] - ETA: 2:51 - loss: 0.2661 - regression_loss: 0.2524 - classification_loss: 0.0136 162/500 [========>.....................] - ETA: 2:51 - loss: 0.2670 - regression_loss: 0.2534 - classification_loss: 0.0137 163/500 [========>.....................] - ETA: 2:50 - loss: 0.2671 - regression_loss: 0.2535 - classification_loss: 0.0136 164/500 [========>.....................] - ETA: 2:50 - loss: 0.2663 - regression_loss: 0.2527 - classification_loss: 0.0136 165/500 [========>.....................] - ETA: 2:49 - loss: 0.2661 - regression_loss: 0.2525 - classification_loss: 0.0136 166/500 [========>.....................] - ETA: 2:49 - loss: 0.2663 - regression_loss: 0.2527 - classification_loss: 0.0136 167/500 [=========>....................] - ETA: 2:48 - loss: 0.2657 - regression_loss: 0.2521 - classification_loss: 0.0136 168/500 [=========>....................] - ETA: 2:48 - loss: 0.2656 - regression_loss: 0.2520 - classification_loss: 0.0136 169/500 [=========>....................] 
- ETA: 2:47 - loss: 0.2665 - regression_loss: 0.2528 - classification_loss: 0.0137 170/500 [=========>....................] - ETA: 2:47 - loss: 0.2677 - regression_loss: 0.2539 - classification_loss: 0.0138 171/500 [=========>....................] - ETA: 2:46 - loss: 0.2695 - regression_loss: 0.2557 - classification_loss: 0.0138 172/500 [=========>....................] - ETA: 2:46 - loss: 0.2697 - regression_loss: 0.2559 - classification_loss: 0.0138 173/500 [=========>....................] - ETA: 2:45 - loss: 0.2689 - regression_loss: 0.2551 - classification_loss: 0.0138 174/500 [=========>....................] - ETA: 2:45 - loss: 0.2709 - regression_loss: 0.2570 - classification_loss: 0.0139 175/500 [=========>....................] - ETA: 2:44 - loss: 0.2718 - regression_loss: 0.2576 - classification_loss: 0.0142 176/500 [=========>....................] - ETA: 2:44 - loss: 0.2721 - regression_loss: 0.2579 - classification_loss: 0.0142 177/500 [=========>....................] - ETA: 2:43 - loss: 0.2726 - regression_loss: 0.2584 - classification_loss: 0.0142 178/500 [=========>....................] - ETA: 2:43 - loss: 0.2716 - regression_loss: 0.2574 - classification_loss: 0.0142 179/500 [=========>....................] - ETA: 2:42 - loss: 0.2714 - regression_loss: 0.2572 - classification_loss: 0.0142 180/500 [=========>....................] - ETA: 2:41 - loss: 0.2714 - regression_loss: 0.2573 - classification_loss: 0.0142 181/500 [=========>....................] - ETA: 2:41 - loss: 0.2706 - regression_loss: 0.2565 - classification_loss: 0.0141 182/500 [=========>....................] - ETA: 2:40 - loss: 0.2705 - regression_loss: 0.2564 - classification_loss: 0.0141 183/500 [=========>....................] - ETA: 2:40 - loss: 0.2718 - regression_loss: 0.2576 - classification_loss: 0.0142 184/500 [==========>...................] - ETA: 2:40 - loss: 0.2714 - regression_loss: 0.2572 - classification_loss: 0.0142 185/500 [==========>...................] 
- ETA: 2:39 - loss: 0.2715 - regression_loss: 0.2573 - classification_loss: 0.0142 186/500 [==========>...................] - ETA: 2:38 - loss: 0.2724 - regression_loss: 0.2580 - classification_loss: 0.0144 187/500 [==========>...................] - ETA: 2:38 - loss: 0.2720 - regression_loss: 0.2576 - classification_loss: 0.0143 188/500 [==========>...................] - ETA: 2:38 - loss: 0.2724 - regression_loss: 0.2580 - classification_loss: 0.0144 189/500 [==========>...................] - ETA: 2:37 - loss: 0.2729 - regression_loss: 0.2585 - classification_loss: 0.0144 190/500 [==========>...................] - ETA: 2:36 - loss: 0.2721 - regression_loss: 0.2577 - classification_loss: 0.0144 191/500 [==========>...................] - ETA: 2:36 - loss: 0.2715 - regression_loss: 0.2572 - classification_loss: 0.0143 192/500 [==========>...................] - ETA: 2:35 - loss: 0.2719 - regression_loss: 0.2576 - classification_loss: 0.0143 193/500 [==========>...................] - ETA: 2:35 - loss: 0.2718 - regression_loss: 0.2575 - classification_loss: 0.0143 194/500 [==========>...................] - ETA: 2:34 - loss: 0.2724 - regression_loss: 0.2580 - classification_loss: 0.0145 195/500 [==========>...................] - ETA: 2:34 - loss: 0.2720 - regression_loss: 0.2576 - classification_loss: 0.0144 196/500 [==========>...................] - ETA: 2:33 - loss: 0.2719 - regression_loss: 0.2575 - classification_loss: 0.0144 197/500 [==========>...................] - ETA: 2:33 - loss: 0.2716 - regression_loss: 0.2572 - classification_loss: 0.0144 198/500 [==========>...................] - ETA: 2:32 - loss: 0.2713 - regression_loss: 0.2569 - classification_loss: 0.0144 199/500 [==========>...................] - ETA: 2:32 - loss: 0.2723 - regression_loss: 0.2578 - classification_loss: 0.0145 200/500 [===========>..................] - ETA: 2:31 - loss: 0.2720 - regression_loss: 0.2575 - classification_loss: 0.0145 201/500 [===========>..................] 
- ETA: 2:31 - loss: 0.2719 - regression_loss: 0.2574 - classification_loss: 0.0145 202/500 [===========>..................] - ETA: 2:30 - loss: 0.2717 - regression_loss: 0.2571 - classification_loss: 0.0145 203/500 [===========>..................] - ETA: 2:30 - loss: 0.2717 - regression_loss: 0.2571 - classification_loss: 0.0145 204/500 [===========>..................] - ETA: 2:29 - loss: 0.2711 - regression_loss: 0.2566 - classification_loss: 0.0145 205/500 [===========>..................] - ETA: 2:29 - loss: 0.2716 - regression_loss: 0.2571 - classification_loss: 0.0145 206/500 [===========>..................] - ETA: 2:28 - loss: 0.2724 - regression_loss: 0.2577 - classification_loss: 0.0147 207/500 [===========>..................] - ETA: 2:28 - loss: 0.2715 - regression_loss: 0.2569 - classification_loss: 0.0146 208/500 [===========>..................] - ETA: 2:27 - loss: 0.2713 - regression_loss: 0.2568 - classification_loss: 0.0145 209/500 [===========>..................] - ETA: 2:27 - loss: 0.2706 - regression_loss: 0.2561 - classification_loss: 0.0145 210/500 [===========>..................] - ETA: 2:26 - loss: 0.2704 - regression_loss: 0.2559 - classification_loss: 0.0144 211/500 [===========>..................] - ETA: 2:26 - loss: 0.2705 - regression_loss: 0.2561 - classification_loss: 0.0145 212/500 [===========>..................] - ETA: 2:25 - loss: 0.2709 - regression_loss: 0.2565 - classification_loss: 0.0145 213/500 [===========>..................] - ETA: 2:25 - loss: 0.2706 - regression_loss: 0.2562 - classification_loss: 0.0144 214/500 [===========>..................] - ETA: 2:24 - loss: 0.2702 - regression_loss: 0.2558 - classification_loss: 0.0143 215/500 [===========>..................] - ETA: 2:24 - loss: 0.2698 - regression_loss: 0.2555 - classification_loss: 0.0143 216/500 [===========>..................] - ETA: 2:23 - loss: 0.2700 - regression_loss: 0.2557 - classification_loss: 0.0143 217/500 [============>.................] 
- ETA: 2:23 - loss: 0.2703 - regression_loss: 0.2559 - classification_loss: 0.0143 218/500 [============>.................] - ETA: 2:22 - loss: 0.2698 - regression_loss: 0.2555 - classification_loss: 0.0143 219/500 [============>.................] - ETA: 2:22 - loss: 0.2691 - regression_loss: 0.2548 - classification_loss: 0.0143 220/500 [============>.................] - ETA: 2:21 - loss: 0.2686 - regression_loss: 0.2544 - classification_loss: 0.0142 221/500 [============>.................] - ETA: 2:21 - loss: 0.2688 - regression_loss: 0.2546 - classification_loss: 0.0142 222/500 [============>.................] - ETA: 2:20 - loss: 0.2680 - regression_loss: 0.2539 - classification_loss: 0.0141 223/500 [============>.................] - ETA: 2:20 - loss: 0.2673 - regression_loss: 0.2532 - classification_loss: 0.0141 224/500 [============>.................] - ETA: 2:19 - loss: 0.2675 - regression_loss: 0.2534 - classification_loss: 0.0141 225/500 [============>.................] - ETA: 2:19 - loss: 0.2678 - regression_loss: 0.2537 - classification_loss: 0.0141 226/500 [============>.................] - ETA: 2:18 - loss: 0.2679 - regression_loss: 0.2538 - classification_loss: 0.0141 227/500 [============>.................] - ETA: 2:18 - loss: 0.2676 - regression_loss: 0.2535 - classification_loss: 0.0141 228/500 [============>.................] - ETA: 2:17 - loss: 0.2675 - regression_loss: 0.2535 - classification_loss: 0.0141 229/500 [============>.................] - ETA: 2:17 - loss: 0.2681 - regression_loss: 0.2539 - classification_loss: 0.0142 230/500 [============>.................] - ETA: 2:16 - loss: 0.2677 - regression_loss: 0.2536 - classification_loss: 0.0141 231/500 [============>.................] - ETA: 2:16 - loss: 0.2678 - regression_loss: 0.2537 - classification_loss: 0.0141 232/500 [============>.................] - ETA: 2:15 - loss: 0.2679 - regression_loss: 0.2538 - classification_loss: 0.0141 233/500 [============>.................] 
- ETA: 2:15 - loss: 0.2685 - regression_loss: 0.2544 - classification_loss: 0.0141 234/500 [=============>................] - ETA: 2:14 - loss: 0.2689 - regression_loss: 0.2547 - classification_loss: 0.0142 235/500 [=============>................] - ETA: 2:14 - loss: 0.2682 - regression_loss: 0.2541 - classification_loss: 0.0141 236/500 [=============>................] - ETA: 2:13 - loss: 0.2675 - regression_loss: 0.2534 - classification_loss: 0.0141 237/500 [=============>................] - ETA: 2:13 - loss: 0.2673 - regression_loss: 0.2532 - classification_loss: 0.0141 238/500 [=============>................] - ETA: 2:12 - loss: 0.2674 - regression_loss: 0.2533 - classification_loss: 0.0141 239/500 [=============>................] - ETA: 2:12 - loss: 0.2669 - regression_loss: 0.2528 - classification_loss: 0.0141 240/500 [=============>................] - ETA: 2:11 - loss: 0.2668 - regression_loss: 0.2528 - classification_loss: 0.0140 241/500 [=============>................] - ETA: 2:11 - loss: 0.2663 - regression_loss: 0.2523 - classification_loss: 0.0140 242/500 [=============>................] - ETA: 2:10 - loss: 0.2658 - regression_loss: 0.2518 - classification_loss: 0.0140 243/500 [=============>................] - ETA: 2:10 - loss: 0.2661 - regression_loss: 0.2521 - classification_loss: 0.0140 244/500 [=============>................] - ETA: 2:09 - loss: 0.2655 - regression_loss: 0.2516 - classification_loss: 0.0139 245/500 [=============>................] - ETA: 2:09 - loss: 0.2650 - regression_loss: 0.2511 - classification_loss: 0.0139 246/500 [=============>................] - ETA: 2:08 - loss: 0.2664 - regression_loss: 0.2523 - classification_loss: 0.0140 247/500 [=============>................] - ETA: 2:08 - loss: 0.2666 - regression_loss: 0.2526 - classification_loss: 0.0141 248/500 [=============>................] - ETA: 2:07 - loss: 0.2666 - regression_loss: 0.2525 - classification_loss: 0.0141 249/500 [=============>................] 
- ETA: 2:07 - loss: 0.2670 - regression_loss: 0.2528 - classification_loss: 0.0142 250/500 [==============>...............] - ETA: 2:06 - loss: 0.2673 - regression_loss: 0.2531 - classification_loss: 0.0142 251/500 [==============>...............] - ETA: 2:06 - loss: 0.2670 - regression_loss: 0.2528 - classification_loss: 0.0142 252/500 [==============>...............] - ETA: 2:05 - loss: 0.2668 - regression_loss: 0.2526 - classification_loss: 0.0141 253/500 [==============>...............] - ETA: 2:05 - loss: 0.2665 - regression_loss: 0.2524 - classification_loss: 0.0141 254/500 [==============>...............] - ETA: 2:04 - loss: 0.2675 - regression_loss: 0.2532 - classification_loss: 0.0142 255/500 [==============>...............] - ETA: 2:04 - loss: 0.2674 - regression_loss: 0.2532 - classification_loss: 0.0142 256/500 [==============>...............] - ETA: 2:03 - loss: 0.2671 - regression_loss: 0.2529 - classification_loss: 0.0142 257/500 [==============>...............] - ETA: 2:03 - loss: 0.2672 - regression_loss: 0.2530 - classification_loss: 0.0142 258/500 [==============>...............] - ETA: 2:02 - loss: 0.2672 - regression_loss: 0.2530 - classification_loss: 0.0142 259/500 [==============>...............] - ETA: 2:02 - loss: 0.2672 - regression_loss: 0.2530 - classification_loss: 0.0142 260/500 [==============>...............] - ETA: 2:01 - loss: 0.2673 - regression_loss: 0.2532 - classification_loss: 0.0142 261/500 [==============>...............] - ETA: 2:00 - loss: 0.2669 - regression_loss: 0.2527 - classification_loss: 0.0141 262/500 [==============>...............] - ETA: 2:00 - loss: 0.2666 - regression_loss: 0.2525 - classification_loss: 0.0141 263/500 [==============>...............] - ETA: 1:59 - loss: 0.2677 - regression_loss: 0.2535 - classification_loss: 0.0143 264/500 [==============>...............] - ETA: 1:59 - loss: 0.2683 - regression_loss: 0.2540 - classification_loss: 0.0143 265/500 [==============>...............] 
- ETA: 1:58 - loss: 0.2682 - regression_loss: 0.2540 - classification_loss: 0.0143 266/500 [==============>...............] - ETA: 1:58 - loss: 0.2675 - regression_loss: 0.2533 - classification_loss: 0.0142 267/500 [===============>..............] - ETA: 1:57 - loss: 0.2677 - regression_loss: 0.2535 - classification_loss: 0.0142 268/500 [===============>..............] - ETA: 1:57 - loss: 0.2672 - regression_loss: 0.2531 - classification_loss: 0.0142 269/500 [===============>..............] - ETA: 1:56 - loss: 0.2675 - regression_loss: 0.2532 - classification_loss: 0.0142 270/500 [===============>..............] - ETA: 1:56 - loss: 0.2676 - regression_loss: 0.2534 - classification_loss: 0.0142 271/500 [===============>..............] - ETA: 1:55 - loss: 0.2674 - regression_loss: 0.2532 - classification_loss: 0.0142 272/500 [===============>..............] - ETA: 1:55 - loss: 0.2672 - regression_loss: 0.2530 - classification_loss: 0.0142 273/500 [===============>..............] - ETA: 1:54 - loss: 0.2675 - regression_loss: 0.2532 - classification_loss: 0.0143 274/500 [===============>..............] - ETA: 1:54 - loss: 0.2673 - regression_loss: 0.2530 - classification_loss: 0.0143 275/500 [===============>..............] - ETA: 1:53 - loss: 0.2667 - regression_loss: 0.2524 - classification_loss: 0.0143 276/500 [===============>..............] - ETA: 1:53 - loss: 0.2666 - regression_loss: 0.2523 - classification_loss: 0.0143 277/500 [===============>..............] - ETA: 1:52 - loss: 0.2667 - regression_loss: 0.2525 - classification_loss: 0.0143 278/500 [===============>..............] - ETA: 1:52 - loss: 0.2666 - regression_loss: 0.2523 - classification_loss: 0.0143 279/500 [===============>..............] - ETA: 1:51 - loss: 0.2670 - regression_loss: 0.2527 - classification_loss: 0.0143 280/500 [===============>..............] - ETA: 1:51 - loss: 0.2667 - regression_loss: 0.2524 - classification_loss: 0.0143 281/500 [===============>..............] 
- ETA: 1:50 - loss: 0.2667 - regression_loss: 0.2525 - classification_loss: 0.0143 282/500 [===============>..............] - ETA: 1:50 - loss: 0.2663 - regression_loss: 0.2521 - classification_loss: 0.0142 283/500 [===============>..............] - ETA: 1:49 - loss: 0.2661 - regression_loss: 0.2519 - classification_loss: 0.0142 284/500 [================>.............] - ETA: 1:49 - loss: 0.2660 - regression_loss: 0.2518 - classification_loss: 0.0142 285/500 [================>.............] - ETA: 1:48 - loss: 0.2658 - regression_loss: 0.2516 - classification_loss: 0.0142 286/500 [================>.............] - ETA: 1:48 - loss: 0.2657 - regression_loss: 0.2515 - classification_loss: 0.0142 287/500 [================>.............] - ETA: 1:47 - loss: 0.2651 - regression_loss: 0.2510 - classification_loss: 0.0142 288/500 [================>.............] - ETA: 1:47 - loss: 0.2653 - regression_loss: 0.2512 - classification_loss: 0.0141 289/500 [================>.............] - ETA: 1:46 - loss: 0.2652 - regression_loss: 0.2511 - classification_loss: 0.0141 290/500 [================>.............] - ETA: 1:46 - loss: 0.2651 - regression_loss: 0.2510 - classification_loss: 0.0141 291/500 [================>.............] - ETA: 1:45 - loss: 0.2649 - regression_loss: 0.2508 - classification_loss: 0.0141 292/500 [================>.............] - ETA: 1:45 - loss: 0.2649 - regression_loss: 0.2507 - classification_loss: 0.0141 293/500 [================>.............] - ETA: 1:44 - loss: 0.2647 - regression_loss: 0.2506 - classification_loss: 0.0141 294/500 [================>.............] - ETA: 1:44 - loss: 0.2647 - regression_loss: 0.2506 - classification_loss: 0.0141 295/500 [================>.............] - ETA: 1:43 - loss: 0.2644 - regression_loss: 0.2504 - classification_loss: 0.0141 296/500 [================>.............] - ETA: 1:43 - loss: 0.2640 - regression_loss: 0.2500 - classification_loss: 0.0140 297/500 [================>.............] 
- ETA: 1:42 - loss: 0.2636 - regression_loss: 0.2496 - classification_loss: 0.0140 298/500 [================>.............] - ETA: 1:42 - loss: 0.2641 - regression_loss: 0.2500 - classification_loss: 0.0140 299/500 [================>.............] - ETA: 1:41 - loss: 0.2638 - regression_loss: 0.2497 - classification_loss: 0.0140 300/500 [=================>............] - ETA: 1:41 - loss: 0.2635 - regression_loss: 0.2495 - classification_loss: 0.0140 301/500 [=================>............] - ETA: 1:40 - loss: 0.2635 - regression_loss: 0.2495 - classification_loss: 0.0140 302/500 [=================>............] - ETA: 1:40 - loss: 0.2635 - regression_loss: 0.2495 - classification_loss: 0.0140 303/500 [=================>............] - ETA: 1:39 - loss: 0.2634 - regression_loss: 0.2493 - classification_loss: 0.0141 304/500 [=================>............] - ETA: 1:39 - loss: 0.2633 - regression_loss: 0.2493 - classification_loss: 0.0140 305/500 [=================>............] - ETA: 1:38 - loss: 0.2629 - regression_loss: 0.2489 - classification_loss: 0.0140 306/500 [=================>............] - ETA: 1:38 - loss: 0.2627 - regression_loss: 0.2488 - classification_loss: 0.0140 307/500 [=================>............] - ETA: 1:37 - loss: 0.2627 - regression_loss: 0.2487 - classification_loss: 0.0140 308/500 [=================>............] - ETA: 1:37 - loss: 0.2624 - regression_loss: 0.2485 - classification_loss: 0.0140 309/500 [=================>............] - ETA: 1:36 - loss: 0.2628 - regression_loss: 0.2488 - classification_loss: 0.0141 310/500 [=================>............] - ETA: 1:36 - loss: 0.2625 - regression_loss: 0.2485 - classification_loss: 0.0141 311/500 [=================>............] - ETA: 1:35 - loss: 0.2625 - regression_loss: 0.2484 - classification_loss: 0.0141 312/500 [=================>............] - ETA: 1:34 - loss: 0.2626 - regression_loss: 0.2485 - classification_loss: 0.0141 313/500 [=================>............] 
- ETA: 1:34 - loss: 0.2622 - regression_loss: 0.2481 - classification_loss: 0.0141 314/500 [=================>............] - ETA: 1:34 - loss: 0.2618 - regression_loss: 0.2478 - classification_loss: 0.0140 315/500 [=================>............] - ETA: 1:33 - loss: 0.2619 - regression_loss: 0.2478 - classification_loss: 0.0140 316/500 [=================>............] - ETA: 1:32 - loss: 0.2615 - regression_loss: 0.2475 - classification_loss: 0.0140 317/500 [==================>...........] - ETA: 1:32 - loss: 0.2611 - regression_loss: 0.2471 - classification_loss: 0.0140 318/500 [==================>...........] - ETA: 1:31 - loss: 0.2619 - regression_loss: 0.2478 - classification_loss: 0.0140 319/500 [==================>...........] - ETA: 1:31 - loss: 0.2619 - regression_loss: 0.2478 - classification_loss: 0.0141 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2613 - regression_loss: 0.2473 - classification_loss: 0.0141 321/500 [==================>...........] - ETA: 1:30 - loss: 0.2609 - regression_loss: 0.2469 - classification_loss: 0.0140 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2613 - regression_loss: 0.2472 - classification_loss: 0.0141 323/500 [==================>...........] - ETA: 1:29 - loss: 0.2617 - regression_loss: 0.2475 - classification_loss: 0.0141 324/500 [==================>...........] - ETA: 1:28 - loss: 0.2620 - regression_loss: 0.2478 - classification_loss: 0.0141 325/500 [==================>...........] - ETA: 1:28 - loss: 0.2619 - regression_loss: 0.2477 - classification_loss: 0.0141 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2620 - regression_loss: 0.2479 - classification_loss: 0.0141 327/500 [==================>...........] - ETA: 1:27 - loss: 0.2618 - regression_loss: 0.2477 - classification_loss: 0.0141 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2620 - regression_loss: 0.2479 - classification_loss: 0.0141 329/500 [==================>...........] 
- ETA: 1:26 - loss: 0.2621 - regression_loss: 0.2480 - classification_loss: 0.0141 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2619 - regression_loss: 0.2478 - classification_loss: 0.0141 331/500 [==================>...........] - ETA: 1:25 - loss: 0.2623 - regression_loss: 0.2482 - classification_loss: 0.0141 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2623 - regression_loss: 0.2482 - classification_loss: 0.0141 333/500 [==================>...........] - ETA: 1:24 - loss: 0.2623 - regression_loss: 0.2482 - classification_loss: 0.0141 334/500 [===================>..........] - ETA: 1:23 - loss: 0.2626 - regression_loss: 0.2485 - classification_loss: 0.0141 335/500 [===================>..........] - ETA: 1:23 - loss: 0.2625 - regression_loss: 0.2485 - classification_loss: 0.0141 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2623 - regression_loss: 0.2483 - classification_loss: 0.0141 337/500 [===================>..........] - ETA: 1:22 - loss: 0.2623 - regression_loss: 0.2482 - classification_loss: 0.0141 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2622 - regression_loss: 0.2481 - classification_loss: 0.0141 339/500 [===================>..........] - ETA: 1:21 - loss: 0.2623 - regression_loss: 0.2482 - classification_loss: 0.0141 340/500 [===================>..........] - ETA: 1:20 - loss: 0.2624 - regression_loss: 0.2483 - classification_loss: 0.0141 341/500 [===================>..........] - ETA: 1:20 - loss: 0.2627 - regression_loss: 0.2486 - classification_loss: 0.0141 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2622 - regression_loss: 0.2482 - classification_loss: 0.0140 343/500 [===================>..........] - ETA: 1:19 - loss: 0.2625 - regression_loss: 0.2484 - classification_loss: 0.0141 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2631 - regression_loss: 0.2490 - classification_loss: 0.0141 345/500 [===================>..........] 
(per-step progress output for steps 346-489/500 of epoch 23 elided; loss held steady around 0.26 - regression_loss ~0.247 - classification_loss ~0.014)
(per-step progress output for steps 490-499/500 elided; loss ~0.261)
500/500 [==============================] - 252s 505ms/step - loss: 0.2608 - regression_loss: 0.2470 - classification_loss: 0.0138
1172 instances of class plum with average precision: 0.7507
mAP: 0.7507
Epoch 00023: saving model to ./training/snapshots/resnet101_pascal_23.h5
Epoch 24/150
(per-step progress output for steps 1-3/500 elided; loss ~0.17-0.26)
(per-step progress output for steps 4-180/500 of epoch 24 elided; loss fluctuating roughly between 0.19 and 0.26 - regression_loss ~0.23-0.25 - classification_loss ~0.013-0.014)
- ETA: 2:41 - loss: 0.2647 - regression_loss: 0.2507 - classification_loss: 0.0140 181/500 [=========>....................] - ETA: 2:40 - loss: 0.2650 - regression_loss: 0.2510 - classification_loss: 0.0140 182/500 [=========>....................] - ETA: 2:40 - loss: 0.2652 - regression_loss: 0.2513 - classification_loss: 0.0139 183/500 [=========>....................] - ETA: 2:39 - loss: 0.2657 - regression_loss: 0.2518 - classification_loss: 0.0139 184/500 [==========>...................] - ETA: 2:39 - loss: 0.2654 - regression_loss: 0.2516 - classification_loss: 0.0139 185/500 [==========>...................] - ETA: 2:38 - loss: 0.2656 - regression_loss: 0.2517 - classification_loss: 0.0139 186/500 [==========>...................] - ETA: 2:38 - loss: 0.2651 - regression_loss: 0.2512 - classification_loss: 0.0139 187/500 [==========>...................] - ETA: 2:37 - loss: 0.2642 - regression_loss: 0.2504 - classification_loss: 0.0138 188/500 [==========>...................] - ETA: 2:37 - loss: 0.2639 - regression_loss: 0.2500 - classification_loss: 0.0138 189/500 [==========>...................] - ETA: 2:36 - loss: 0.2629 - regression_loss: 0.2492 - classification_loss: 0.0137 190/500 [==========>...................] - ETA: 2:36 - loss: 0.2623 - regression_loss: 0.2486 - classification_loss: 0.0138 191/500 [==========>...................] - ETA: 2:35 - loss: 0.2630 - regression_loss: 0.2492 - classification_loss: 0.0138 192/500 [==========>...................] - ETA: 2:35 - loss: 0.2626 - regression_loss: 0.2487 - classification_loss: 0.0139 193/500 [==========>...................] - ETA: 2:34 - loss: 0.2617 - regression_loss: 0.2479 - classification_loss: 0.0138 194/500 [==========>...................] - ETA: 2:34 - loss: 0.2620 - regression_loss: 0.2481 - classification_loss: 0.0138 195/500 [==========>...................] - ETA: 2:33 - loss: 0.2616 - regression_loss: 0.2478 - classification_loss: 0.0138 196/500 [==========>...................] 
- ETA: 2:33 - loss: 0.2612 - regression_loss: 0.2475 - classification_loss: 0.0138 197/500 [==========>...................] - ETA: 2:32 - loss: 0.2623 - regression_loss: 0.2483 - classification_loss: 0.0139 198/500 [==========>...................] - ETA: 2:32 - loss: 0.2633 - regression_loss: 0.2492 - classification_loss: 0.0140 199/500 [==========>...................] - ETA: 2:31 - loss: 0.2630 - regression_loss: 0.2490 - classification_loss: 0.0140 200/500 [===========>..................] - ETA: 2:31 - loss: 0.2626 - regression_loss: 0.2486 - classification_loss: 0.0140 201/500 [===========>..................] - ETA: 2:30 - loss: 0.2627 - regression_loss: 0.2488 - classification_loss: 0.0139 202/500 [===========>..................] - ETA: 2:30 - loss: 0.2624 - regression_loss: 0.2485 - classification_loss: 0.0139 203/500 [===========>..................] - ETA: 2:29 - loss: 0.2637 - regression_loss: 0.2497 - classification_loss: 0.0140 204/500 [===========>..................] - ETA: 2:29 - loss: 0.2631 - regression_loss: 0.2491 - classification_loss: 0.0140 205/500 [===========>..................] - ETA: 2:28 - loss: 0.2624 - regression_loss: 0.2485 - classification_loss: 0.0139 206/500 [===========>..................] - ETA: 2:28 - loss: 0.2635 - regression_loss: 0.2496 - classification_loss: 0.0139 207/500 [===========>..................] - ETA: 2:27 - loss: 0.2633 - regression_loss: 0.2494 - classification_loss: 0.0139 208/500 [===========>..................] - ETA: 2:27 - loss: 0.2627 - regression_loss: 0.2489 - classification_loss: 0.0139 209/500 [===========>..................] - ETA: 2:26 - loss: 0.2620 - regression_loss: 0.2482 - classification_loss: 0.0138 210/500 [===========>..................] - ETA: 2:26 - loss: 0.2614 - regression_loss: 0.2476 - classification_loss: 0.0138 211/500 [===========>..................] - ETA: 2:25 - loss: 0.2611 - regression_loss: 0.2474 - classification_loss: 0.0137 212/500 [===========>..................] 
- ETA: 2:25 - loss: 0.2608 - regression_loss: 0.2470 - classification_loss: 0.0138 213/500 [===========>..................] - ETA: 2:24 - loss: 0.2606 - regression_loss: 0.2468 - classification_loss: 0.0137 214/500 [===========>..................] - ETA: 2:24 - loss: 0.2608 - regression_loss: 0.2470 - classification_loss: 0.0138 215/500 [===========>..................] - ETA: 2:23 - loss: 0.2607 - regression_loss: 0.2469 - classification_loss: 0.0138 216/500 [===========>..................] - ETA: 2:23 - loss: 0.2610 - regression_loss: 0.2472 - classification_loss: 0.0138 217/500 [============>.................] - ETA: 2:22 - loss: 0.2606 - regression_loss: 0.2468 - classification_loss: 0.0137 218/500 [============>.................] - ETA: 2:22 - loss: 0.2605 - regression_loss: 0.2468 - classification_loss: 0.0137 219/500 [============>.................] - ETA: 2:21 - loss: 0.2602 - regression_loss: 0.2466 - classification_loss: 0.0137 220/500 [============>.................] - ETA: 2:21 - loss: 0.2601 - regression_loss: 0.2464 - classification_loss: 0.0136 221/500 [============>.................] - ETA: 2:20 - loss: 0.2595 - regression_loss: 0.2459 - classification_loss: 0.0136 222/500 [============>.................] - ETA: 2:20 - loss: 0.2595 - regression_loss: 0.2460 - classification_loss: 0.0136 223/500 [============>.................] - ETA: 2:19 - loss: 0.2597 - regression_loss: 0.2461 - classification_loss: 0.0136 224/500 [============>.................] - ETA: 2:19 - loss: 0.2589 - regression_loss: 0.2454 - classification_loss: 0.0135 225/500 [============>.................] - ETA: 2:18 - loss: 0.2590 - regression_loss: 0.2455 - classification_loss: 0.0135 226/500 [============>.................] - ETA: 2:18 - loss: 0.2584 - regression_loss: 0.2449 - classification_loss: 0.0134 227/500 [============>.................] - ETA: 2:17 - loss: 0.2587 - regression_loss: 0.2453 - classification_loss: 0.0134 228/500 [============>.................] 
- ETA: 2:17 - loss: 0.2585 - regression_loss: 0.2452 - classification_loss: 0.0134 229/500 [============>.................] - ETA: 2:16 - loss: 0.2586 - regression_loss: 0.2452 - classification_loss: 0.0134 230/500 [============>.................] - ETA: 2:16 - loss: 0.2581 - regression_loss: 0.2447 - classification_loss: 0.0134 231/500 [============>.................] - ETA: 2:15 - loss: 0.2577 - regression_loss: 0.2444 - classification_loss: 0.0133 232/500 [============>.................] - ETA: 2:15 - loss: 0.2586 - regression_loss: 0.2451 - classification_loss: 0.0135 233/500 [============>.................] - ETA: 2:14 - loss: 0.2583 - regression_loss: 0.2448 - classification_loss: 0.0135 234/500 [=============>................] - ETA: 2:14 - loss: 0.2587 - regression_loss: 0.2453 - classification_loss: 0.0135 235/500 [=============>................] - ETA: 2:13 - loss: 0.2592 - regression_loss: 0.2457 - classification_loss: 0.0135 236/500 [=============>................] - ETA: 2:13 - loss: 0.2596 - regression_loss: 0.2461 - classification_loss: 0.0135 237/500 [=============>................] - ETA: 2:12 - loss: 0.2594 - regression_loss: 0.2459 - classification_loss: 0.0135 238/500 [=============>................] - ETA: 2:12 - loss: 0.2591 - regression_loss: 0.2456 - classification_loss: 0.0135 239/500 [=============>................] - ETA: 2:11 - loss: 0.2588 - regression_loss: 0.2454 - classification_loss: 0.0134 240/500 [=============>................] - ETA: 2:11 - loss: 0.2590 - regression_loss: 0.2456 - classification_loss: 0.0134 241/500 [=============>................] - ETA: 2:10 - loss: 0.2591 - regression_loss: 0.2457 - classification_loss: 0.0134 242/500 [=============>................] - ETA: 2:10 - loss: 0.2591 - regression_loss: 0.2457 - classification_loss: 0.0134 243/500 [=============>................] - ETA: 2:09 - loss: 0.2593 - regression_loss: 0.2459 - classification_loss: 0.0134 244/500 [=============>................] 
- ETA: 2:09 - loss: 0.2601 - regression_loss: 0.2467 - classification_loss: 0.0134 245/500 [=============>................] - ETA: 2:08 - loss: 0.2598 - regression_loss: 0.2464 - classification_loss: 0.0134 246/500 [=============>................] - ETA: 2:08 - loss: 0.2598 - regression_loss: 0.2464 - classification_loss: 0.0134 247/500 [=============>................] - ETA: 2:07 - loss: 0.2597 - regression_loss: 0.2463 - classification_loss: 0.0134 248/500 [=============>................] - ETA: 2:07 - loss: 0.2593 - regression_loss: 0.2459 - classification_loss: 0.0134 249/500 [=============>................] - ETA: 2:06 - loss: 0.2596 - regression_loss: 0.2462 - classification_loss: 0.0133 250/500 [==============>...............] - ETA: 2:06 - loss: 0.2589 - regression_loss: 0.2456 - classification_loss: 0.0133 251/500 [==============>...............] - ETA: 2:05 - loss: 0.2594 - regression_loss: 0.2460 - classification_loss: 0.0134 252/500 [==============>...............] - ETA: 2:05 - loss: 0.2597 - regression_loss: 0.2462 - classification_loss: 0.0134 253/500 [==============>...............] - ETA: 2:04 - loss: 0.2601 - regression_loss: 0.2466 - classification_loss: 0.0135 254/500 [==============>...............] - ETA: 2:03 - loss: 0.2609 - regression_loss: 0.2473 - classification_loss: 0.0136 255/500 [==============>...............] - ETA: 2:03 - loss: 0.2614 - regression_loss: 0.2478 - classification_loss: 0.0136 256/500 [==============>...............] - ETA: 2:03 - loss: 0.2614 - regression_loss: 0.2478 - classification_loss: 0.0136 257/500 [==============>...............] - ETA: 2:02 - loss: 0.2618 - regression_loss: 0.2482 - classification_loss: 0.0136 258/500 [==============>...............] - ETA: 2:01 - loss: 0.2617 - regression_loss: 0.2482 - classification_loss: 0.0136 259/500 [==============>...............] - ETA: 2:01 - loss: 0.2616 - regression_loss: 0.2481 - classification_loss: 0.0136 260/500 [==============>...............] 
- ETA: 2:00 - loss: 0.2614 - regression_loss: 0.2478 - classification_loss: 0.0135 261/500 [==============>...............] - ETA: 2:00 - loss: 0.2612 - regression_loss: 0.2477 - classification_loss: 0.0135 262/500 [==============>...............] - ETA: 1:59 - loss: 0.2606 - regression_loss: 0.2471 - classification_loss: 0.0135 263/500 [==============>...............] - ETA: 1:59 - loss: 0.2606 - regression_loss: 0.2471 - classification_loss: 0.0135 264/500 [==============>...............] - ETA: 1:58 - loss: 0.2604 - regression_loss: 0.2470 - classification_loss: 0.0135 265/500 [==============>...............] - ETA: 1:58 - loss: 0.2607 - regression_loss: 0.2473 - classification_loss: 0.0135 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2611 - regression_loss: 0.2476 - classification_loss: 0.0135 267/500 [===============>..............] - ETA: 1:57 - loss: 0.2615 - regression_loss: 0.2479 - classification_loss: 0.0136 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2630 - regression_loss: 0.2494 - classification_loss: 0.0137 269/500 [===============>..............] - ETA: 1:56 - loss: 0.2631 - regression_loss: 0.2494 - classification_loss: 0.0137 270/500 [===============>..............] - ETA: 1:55 - loss: 0.2629 - regression_loss: 0.2493 - classification_loss: 0.0137 271/500 [===============>..............] - ETA: 1:55 - loss: 0.2628 - regression_loss: 0.2492 - classification_loss: 0.0137 272/500 [===============>..............] - ETA: 1:54 - loss: 0.2630 - regression_loss: 0.2493 - classification_loss: 0.0137 273/500 [===============>..............] - ETA: 1:54 - loss: 0.2633 - regression_loss: 0.2496 - classification_loss: 0.0137 274/500 [===============>..............] - ETA: 1:53 - loss: 0.2636 - regression_loss: 0.2499 - classification_loss: 0.0137 275/500 [===============>..............] - ETA: 1:53 - loss: 0.2633 - regression_loss: 0.2497 - classification_loss: 0.0137 276/500 [===============>..............] 
- ETA: 1:52 - loss: 0.2634 - regression_loss: 0.2497 - classification_loss: 0.0137 277/500 [===============>..............] - ETA: 1:52 - loss: 0.2638 - regression_loss: 0.2500 - classification_loss: 0.0137 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2642 - regression_loss: 0.2505 - classification_loss: 0.0138 279/500 [===============>..............] - ETA: 1:51 - loss: 0.2645 - regression_loss: 0.2507 - classification_loss: 0.0138 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2646 - regression_loss: 0.2508 - classification_loss: 0.0138 281/500 [===============>..............] - ETA: 1:50 - loss: 0.2648 - regression_loss: 0.2510 - classification_loss: 0.0138 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2646 - regression_loss: 0.2508 - classification_loss: 0.0138 283/500 [===============>..............] - ETA: 1:49 - loss: 0.2645 - regression_loss: 0.2506 - classification_loss: 0.0138 284/500 [================>.............] - ETA: 1:48 - loss: 0.2642 - regression_loss: 0.2504 - classification_loss: 0.0138 285/500 [================>.............] - ETA: 1:48 - loss: 0.2639 - regression_loss: 0.2502 - classification_loss: 0.0138 286/500 [================>.............] - ETA: 1:47 - loss: 0.2638 - regression_loss: 0.2500 - classification_loss: 0.0138 287/500 [================>.............] - ETA: 1:47 - loss: 0.2639 - regression_loss: 0.2501 - classification_loss: 0.0138 288/500 [================>.............] - ETA: 1:46 - loss: 0.2642 - regression_loss: 0.2504 - classification_loss: 0.0139 289/500 [================>.............] - ETA: 1:46 - loss: 0.2640 - regression_loss: 0.2501 - classification_loss: 0.0139 290/500 [================>.............] - ETA: 1:45 - loss: 0.2632 - regression_loss: 0.2494 - classification_loss: 0.0138 291/500 [================>.............] - ETA: 1:45 - loss: 0.2632 - regression_loss: 0.2494 - classification_loss: 0.0138 292/500 [================>.............] 
- ETA: 1:44 - loss: 0.2627 - regression_loss: 0.2489 - classification_loss: 0.0138 293/500 [================>.............] - ETA: 1:44 - loss: 0.2622 - regression_loss: 0.2485 - classification_loss: 0.0137 294/500 [================>.............] - ETA: 1:43 - loss: 0.2617 - regression_loss: 0.2481 - classification_loss: 0.0137 295/500 [================>.............] - ETA: 1:43 - loss: 0.2617 - regression_loss: 0.2480 - classification_loss: 0.0137 296/500 [================>.............] - ETA: 1:42 - loss: 0.2629 - regression_loss: 0.2492 - classification_loss: 0.0137 297/500 [================>.............] - ETA: 1:42 - loss: 0.2633 - regression_loss: 0.2496 - classification_loss: 0.0137 298/500 [================>.............] - ETA: 1:41 - loss: 0.2639 - regression_loss: 0.2502 - classification_loss: 0.0137 299/500 [================>.............] - ETA: 1:41 - loss: 0.2646 - regression_loss: 0.2507 - classification_loss: 0.0139 300/500 [=================>............] - ETA: 1:40 - loss: 0.2659 - regression_loss: 0.2520 - classification_loss: 0.0140 301/500 [=================>............] - ETA: 1:40 - loss: 0.2657 - regression_loss: 0.2518 - classification_loss: 0.0139 302/500 [=================>............] - ETA: 1:39 - loss: 0.2660 - regression_loss: 0.2521 - classification_loss: 0.0139 303/500 [=================>............] - ETA: 1:39 - loss: 0.2665 - regression_loss: 0.2525 - classification_loss: 0.0140 304/500 [=================>............] - ETA: 1:38 - loss: 0.2671 - regression_loss: 0.2530 - classification_loss: 0.0141 305/500 [=================>............] - ETA: 1:38 - loss: 0.2667 - regression_loss: 0.2527 - classification_loss: 0.0140 306/500 [=================>............] - ETA: 1:37 - loss: 0.2662 - regression_loss: 0.2522 - classification_loss: 0.0140 307/500 [=================>............] - ETA: 1:37 - loss: 0.2663 - regression_loss: 0.2523 - classification_loss: 0.0140 308/500 [=================>............] 
- ETA: 1:36 - loss: 0.2663 - regression_loss: 0.2523 - classification_loss: 0.0140 309/500 [=================>............] - ETA: 1:36 - loss: 0.2665 - regression_loss: 0.2525 - classification_loss: 0.0140 310/500 [=================>............] - ETA: 1:35 - loss: 0.2673 - regression_loss: 0.2533 - classification_loss: 0.0141 311/500 [=================>............] - ETA: 1:35 - loss: 0.2684 - regression_loss: 0.2543 - classification_loss: 0.0141 312/500 [=================>............] - ETA: 1:34 - loss: 0.2687 - regression_loss: 0.2546 - classification_loss: 0.0141 313/500 [=================>............] - ETA: 1:34 - loss: 0.2687 - regression_loss: 0.2546 - classification_loss: 0.0141 314/500 [=================>............] - ETA: 1:33 - loss: 0.2684 - regression_loss: 0.2543 - classification_loss: 0.0141 315/500 [=================>............] - ETA: 1:33 - loss: 0.2682 - regression_loss: 0.2542 - classification_loss: 0.0140 316/500 [=================>............] - ETA: 1:32 - loss: 0.2680 - regression_loss: 0.2540 - classification_loss: 0.0140 317/500 [==================>...........] - ETA: 1:32 - loss: 0.2677 - regression_loss: 0.2537 - classification_loss: 0.0140 318/500 [==================>...........] - ETA: 1:31 - loss: 0.2676 - regression_loss: 0.2536 - classification_loss: 0.0140 319/500 [==================>...........] - ETA: 1:31 - loss: 0.2676 - regression_loss: 0.2536 - classification_loss: 0.0140 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2672 - regression_loss: 0.2532 - classification_loss: 0.0140 321/500 [==================>...........] - ETA: 1:30 - loss: 0.2668 - regression_loss: 0.2528 - classification_loss: 0.0140 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2666 - regression_loss: 0.2526 - classification_loss: 0.0140 323/500 [==================>...........] - ETA: 1:29 - loss: 0.2664 - regression_loss: 0.2524 - classification_loss: 0.0140 324/500 [==================>...........] 
- ETA: 1:28 - loss: 0.2667 - regression_loss: 0.2527 - classification_loss: 0.0140 325/500 [==================>...........] - ETA: 1:28 - loss: 0.2673 - regression_loss: 0.2533 - classification_loss: 0.0140 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2673 - regression_loss: 0.2533 - classification_loss: 0.0140 327/500 [==================>...........] - ETA: 1:27 - loss: 0.2680 - regression_loss: 0.2539 - classification_loss: 0.0140 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2676 - regression_loss: 0.2536 - classification_loss: 0.0140 329/500 [==================>...........] - ETA: 1:26 - loss: 0.2676 - regression_loss: 0.2536 - classification_loss: 0.0140 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2676 - regression_loss: 0.2535 - classification_loss: 0.0141 331/500 [==================>...........] - ETA: 1:25 - loss: 0.2671 - regression_loss: 0.2531 - classification_loss: 0.0140 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2672 - regression_loss: 0.2531 - classification_loss: 0.0140 333/500 [==================>...........] - ETA: 1:24 - loss: 0.2668 - regression_loss: 0.2528 - classification_loss: 0.0140 334/500 [===================>..........] - ETA: 1:23 - loss: 0.2668 - regression_loss: 0.2528 - classification_loss: 0.0140 335/500 [===================>..........] - ETA: 1:23 - loss: 0.2662 - regression_loss: 0.2522 - classification_loss: 0.0140 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2659 - regression_loss: 0.2519 - classification_loss: 0.0140 337/500 [===================>..........] - ETA: 1:22 - loss: 0.2656 - regression_loss: 0.2517 - classification_loss: 0.0139 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2665 - regression_loss: 0.2524 - classification_loss: 0.0141 339/500 [===================>..........] - ETA: 1:21 - loss: 0.2663 - regression_loss: 0.2523 - classification_loss: 0.0140 340/500 [===================>..........] 
- ETA: 1:20 - loss: 0.2670 - regression_loss: 0.2527 - classification_loss: 0.0142 341/500 [===================>..........] - ETA: 1:20 - loss: 0.2670 - regression_loss: 0.2527 - classification_loss: 0.0142 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2670 - regression_loss: 0.2527 - classification_loss: 0.0142 343/500 [===================>..........] - ETA: 1:19 - loss: 0.2669 - regression_loss: 0.2527 - classification_loss: 0.0142 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2669 - regression_loss: 0.2527 - classification_loss: 0.0142 345/500 [===================>..........] - ETA: 1:18 - loss: 0.2669 - regression_loss: 0.2527 - classification_loss: 0.0142 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2667 - regression_loss: 0.2525 - classification_loss: 0.0142 347/500 [===================>..........] - ETA: 1:17 - loss: 0.2668 - regression_loss: 0.2525 - classification_loss: 0.0142 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2669 - regression_loss: 0.2526 - classification_loss: 0.0143 349/500 [===================>..........] - ETA: 1:16 - loss: 0.2670 - regression_loss: 0.2528 - classification_loss: 0.0143 350/500 [====================>.........] - ETA: 1:15 - loss: 0.2667 - regression_loss: 0.2525 - classification_loss: 0.0143 351/500 [====================>.........] - ETA: 1:15 - loss: 0.2666 - regression_loss: 0.2523 - classification_loss: 0.0142 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2672 - regression_loss: 0.2529 - classification_loss: 0.0143 353/500 [====================>.........] - ETA: 1:14 - loss: 0.2669 - regression_loss: 0.2527 - classification_loss: 0.0143 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2667 - regression_loss: 0.2524 - classification_loss: 0.0143 355/500 [====================>.........] - ETA: 1:13 - loss: 0.2666 - regression_loss: 0.2524 - classification_loss: 0.0143 356/500 [====================>.........] 
- ETA: 1:12 - loss: 0.2661 - regression_loss: 0.2519 - classification_loss: 0.0142 357/500 [====================>.........] - ETA: 1:12 - loss: 0.2662 - regression_loss: 0.2520 - classification_loss: 0.0142 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2658 - regression_loss: 0.2516 - classification_loss: 0.0142 359/500 [====================>.........] - ETA: 1:11 - loss: 0.2658 - regression_loss: 0.2517 - classification_loss: 0.0142 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2657 - regression_loss: 0.2515 - classification_loss: 0.0141 361/500 [====================>.........] - ETA: 1:10 - loss: 0.2657 - regression_loss: 0.2515 - classification_loss: 0.0142 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2659 - regression_loss: 0.2517 - classification_loss: 0.0142 363/500 [====================>.........] - ETA: 1:09 - loss: 0.2656 - regression_loss: 0.2514 - classification_loss: 0.0142 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2653 - regression_loss: 0.2512 - classification_loss: 0.0142 365/500 [====================>.........] - ETA: 1:08 - loss: 0.2660 - regression_loss: 0.2518 - classification_loss: 0.0142 366/500 [====================>.........] - ETA: 1:07 - loss: 0.2666 - regression_loss: 0.2524 - classification_loss: 0.0142 367/500 [=====================>........] - ETA: 1:07 - loss: 0.2664 - regression_loss: 0.2521 - classification_loss: 0.0142 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2662 - regression_loss: 0.2520 - classification_loss: 0.0142 369/500 [=====================>........] - ETA: 1:06 - loss: 0.2663 - regression_loss: 0.2520 - classification_loss: 0.0143 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2661 - regression_loss: 0.2519 - classification_loss: 0.0142 371/500 [=====================>........] - ETA: 1:05 - loss: 0.2658 - regression_loss: 0.2516 - classification_loss: 0.0142 372/500 [=====================>........] 
- ETA: 1:04 - loss: 0.2662 - regression_loss: 0.2520 - classification_loss: 0.0143 373/500 [=====================>........] - ETA: 1:04 - loss: 0.2666 - regression_loss: 0.2524 - classification_loss: 0.0143 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2664 - regression_loss: 0.2521 - classification_loss: 0.0143 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2664 - regression_loss: 0.2521 - classification_loss: 0.0143 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2668 - regression_loss: 0.2526 - classification_loss: 0.0143 377/500 [=====================>........] - ETA: 1:02 - loss: 0.2675 - regression_loss: 0.2533 - classification_loss: 0.0143 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2671 - regression_loss: 0.2529 - classification_loss: 0.0142 379/500 [=====================>........] - ETA: 1:01 - loss: 0.2674 - regression_loss: 0.2532 - classification_loss: 0.0143 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2671 - regression_loss: 0.2529 - classification_loss: 0.0142 381/500 [=====================>........] - ETA: 1:00 - loss: 0.2680 - regression_loss: 0.2536 - classification_loss: 0.0144 382/500 [=====================>........] - ETA: 59s - loss: 0.2677 - regression_loss: 0.2533 - classification_loss: 0.0144  383/500 [=====================>........] - ETA: 58s - loss: 0.2671 - regression_loss: 0.2528 - classification_loss: 0.0143 384/500 [======================>.......] - ETA: 58s - loss: 0.2670 - regression_loss: 0.2527 - classification_loss: 0.0143 385/500 [======================>.......] - ETA: 57s - loss: 0.2671 - regression_loss: 0.2528 - classification_loss: 0.0143 386/500 [======================>.......] - ETA: 57s - loss: 0.2677 - regression_loss: 0.2533 - classification_loss: 0.0143 387/500 [======================>.......] - ETA: 56s - loss: 0.2676 - regression_loss: 0.2532 - classification_loss: 0.0143 388/500 [======================>.......] 
[... per-batch progress lines (389/500 through 499/500) omitted ...]
500/500 [==============================] - 252s 504ms/step - loss: 0.2718 - regression_loss: 0.2576 - classification_loss: 0.0142
1172 instances of class plum with average precision: 0.7443
mAP: 0.7443
Epoch 00024: saving model to ./training/snapshots/resnet101_pascal_24.h5
Epoch 25/150
[... per-batch progress lines omitted ...]
- ETA: 2:19 - loss: 0.2623 - regression_loss: 0.2488 - classification_loss: 0.0135 223/500 [============>.................] - ETA: 2:19 - loss: 0.2623 - regression_loss: 0.2488 - classification_loss: 0.0135 224/500 [============>.................] - ETA: 2:18 - loss: 0.2618 - regression_loss: 0.2484 - classification_loss: 0.0135 225/500 [============>.................] - ETA: 2:18 - loss: 0.2625 - regression_loss: 0.2490 - classification_loss: 0.0135 226/500 [============>.................] - ETA: 2:17 - loss: 0.2622 - regression_loss: 0.2488 - classification_loss: 0.0134 227/500 [============>.................] - ETA: 2:17 - loss: 0.2624 - regression_loss: 0.2489 - classification_loss: 0.0134 228/500 [============>.................] - ETA: 2:16 - loss: 0.2622 - regression_loss: 0.2488 - classification_loss: 0.0134 229/500 [============>.................] - ETA: 2:16 - loss: 0.2619 - regression_loss: 0.2485 - classification_loss: 0.0134 230/500 [============>.................] - ETA: 2:15 - loss: 0.2618 - regression_loss: 0.2485 - classification_loss: 0.0134 231/500 [============>.................] - ETA: 2:15 - loss: 0.2614 - regression_loss: 0.2481 - classification_loss: 0.0133 232/500 [============>.................] - ETA: 2:14 - loss: 0.2616 - regression_loss: 0.2482 - classification_loss: 0.0134 233/500 [============>.................] - ETA: 2:14 - loss: 0.2613 - regression_loss: 0.2480 - classification_loss: 0.0133 234/500 [=============>................] - ETA: 2:13 - loss: 0.2610 - regression_loss: 0.2477 - classification_loss: 0.0133 235/500 [=============>................] - ETA: 2:13 - loss: 0.2611 - regression_loss: 0.2478 - classification_loss: 0.0133 236/500 [=============>................] - ETA: 2:12 - loss: 0.2609 - regression_loss: 0.2476 - classification_loss: 0.0133 237/500 [=============>................] - ETA: 2:12 - loss: 0.2607 - regression_loss: 0.2473 - classification_loss: 0.0134 238/500 [=============>................] 
- ETA: 2:11 - loss: 0.2603 - regression_loss: 0.2469 - classification_loss: 0.0133 239/500 [=============>................] - ETA: 2:11 - loss: 0.2608 - regression_loss: 0.2473 - classification_loss: 0.0134 240/500 [=============>................] - ETA: 2:10 - loss: 0.2600 - regression_loss: 0.2466 - classification_loss: 0.0134 241/500 [=============>................] - ETA: 2:10 - loss: 0.2599 - regression_loss: 0.2465 - classification_loss: 0.0134 242/500 [=============>................] - ETA: 2:09 - loss: 0.2600 - regression_loss: 0.2465 - classification_loss: 0.0134 243/500 [=============>................] - ETA: 2:09 - loss: 0.2598 - regression_loss: 0.2464 - classification_loss: 0.0134 244/500 [=============>................] - ETA: 2:08 - loss: 0.2597 - regression_loss: 0.2463 - classification_loss: 0.0134 245/500 [=============>................] - ETA: 2:08 - loss: 0.2594 - regression_loss: 0.2461 - classification_loss: 0.0133 246/500 [=============>................] - ETA: 2:07 - loss: 0.2589 - regression_loss: 0.2457 - classification_loss: 0.0133 247/500 [=============>................] - ETA: 2:07 - loss: 0.2595 - regression_loss: 0.2461 - classification_loss: 0.0134 248/500 [=============>................] - ETA: 2:06 - loss: 0.2609 - regression_loss: 0.2474 - classification_loss: 0.0135 249/500 [=============>................] - ETA: 2:06 - loss: 0.2605 - regression_loss: 0.2470 - classification_loss: 0.0135 250/500 [==============>...............] - ETA: 2:05 - loss: 0.2600 - regression_loss: 0.2465 - classification_loss: 0.0134 251/500 [==============>...............] - ETA: 2:05 - loss: 0.2595 - regression_loss: 0.2461 - classification_loss: 0.0134 252/500 [==============>...............] - ETA: 2:04 - loss: 0.2597 - regression_loss: 0.2463 - classification_loss: 0.0134 253/500 [==============>...............] - ETA: 2:04 - loss: 0.2594 - regression_loss: 0.2460 - classification_loss: 0.0134 254/500 [==============>...............] 
- ETA: 2:03 - loss: 0.2593 - regression_loss: 0.2459 - classification_loss: 0.0134 255/500 [==============>...............] - ETA: 2:03 - loss: 0.2590 - regression_loss: 0.2456 - classification_loss: 0.0134 256/500 [==============>...............] - ETA: 2:02 - loss: 0.2592 - regression_loss: 0.2458 - classification_loss: 0.0134 257/500 [==============>...............] - ETA: 2:02 - loss: 0.2591 - regression_loss: 0.2457 - classification_loss: 0.0134 258/500 [==============>...............] - ETA: 2:01 - loss: 0.2587 - regression_loss: 0.2453 - classification_loss: 0.0134 259/500 [==============>...............] - ETA: 2:01 - loss: 0.2591 - regression_loss: 0.2457 - classification_loss: 0.0134 260/500 [==============>...............] - ETA: 2:00 - loss: 0.2596 - regression_loss: 0.2462 - classification_loss: 0.0134 261/500 [==============>...............] - ETA: 2:00 - loss: 0.2594 - regression_loss: 0.2460 - classification_loss: 0.0134 262/500 [==============>...............] - ETA: 1:59 - loss: 0.2590 - regression_loss: 0.2457 - classification_loss: 0.0134 263/500 [==============>...............] - ETA: 1:59 - loss: 0.2588 - regression_loss: 0.2454 - classification_loss: 0.0134 264/500 [==============>...............] - ETA: 1:58 - loss: 0.2601 - regression_loss: 0.2466 - classification_loss: 0.0135 265/500 [==============>...............] - ETA: 1:58 - loss: 0.2604 - regression_loss: 0.2469 - classification_loss: 0.0135 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2601 - regression_loss: 0.2466 - classification_loss: 0.0135 267/500 [===============>..............] - ETA: 1:57 - loss: 0.2602 - regression_loss: 0.2467 - classification_loss: 0.0135 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2603 - regression_loss: 0.2468 - classification_loss: 0.0135 269/500 [===============>..............] - ETA: 1:56 - loss: 0.2602 - regression_loss: 0.2468 - classification_loss: 0.0135 270/500 [===============>..............] 
- ETA: 1:55 - loss: 0.2596 - regression_loss: 0.2462 - classification_loss: 0.0134 271/500 [===============>..............] - ETA: 1:55 - loss: 0.2594 - regression_loss: 0.2460 - classification_loss: 0.0134 272/500 [===============>..............] - ETA: 1:54 - loss: 0.2591 - regression_loss: 0.2457 - classification_loss: 0.0134 273/500 [===============>..............] - ETA: 1:54 - loss: 0.2589 - regression_loss: 0.2455 - classification_loss: 0.0134 274/500 [===============>..............] - ETA: 1:53 - loss: 0.2586 - regression_loss: 0.2452 - classification_loss: 0.0134 275/500 [===============>..............] - ETA: 1:53 - loss: 0.2582 - regression_loss: 0.2449 - classification_loss: 0.0133 276/500 [===============>..............] - ETA: 1:52 - loss: 0.2588 - regression_loss: 0.2454 - classification_loss: 0.0134 277/500 [===============>..............] - ETA: 1:52 - loss: 0.2587 - regression_loss: 0.2453 - classification_loss: 0.0134 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2588 - regression_loss: 0.2454 - classification_loss: 0.0134 279/500 [===============>..............] - ETA: 1:51 - loss: 0.2582 - regression_loss: 0.2448 - classification_loss: 0.0134 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2583 - regression_loss: 0.2449 - classification_loss: 0.0134 281/500 [===============>..............] - ETA: 1:50 - loss: 0.2592 - regression_loss: 0.2457 - classification_loss: 0.0134 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2595 - regression_loss: 0.2461 - classification_loss: 0.0134 283/500 [===============>..............] - ETA: 1:49 - loss: 0.2595 - regression_loss: 0.2461 - classification_loss: 0.0134 284/500 [================>.............] - ETA: 1:48 - loss: 0.2595 - regression_loss: 0.2461 - classification_loss: 0.0134 285/500 [================>.............] - ETA: 1:48 - loss: 0.2594 - regression_loss: 0.2460 - classification_loss: 0.0134 286/500 [================>.............] 
- ETA: 1:47 - loss: 0.2598 - regression_loss: 0.2464 - classification_loss: 0.0134 287/500 [================>.............] - ETA: 1:47 - loss: 0.2595 - regression_loss: 0.2462 - classification_loss: 0.0134 288/500 [================>.............] - ETA: 1:46 - loss: 0.2599 - regression_loss: 0.2466 - classification_loss: 0.0134 289/500 [================>.............] - ETA: 1:46 - loss: 0.2594 - regression_loss: 0.2461 - classification_loss: 0.0133 290/500 [================>.............] - ETA: 1:45 - loss: 0.2595 - regression_loss: 0.2462 - classification_loss: 0.0133 291/500 [================>.............] - ETA: 1:45 - loss: 0.2594 - regression_loss: 0.2461 - classification_loss: 0.0133 292/500 [================>.............] - ETA: 1:44 - loss: 0.2598 - regression_loss: 0.2464 - classification_loss: 0.0134 293/500 [================>.............] - ETA: 1:44 - loss: 0.2596 - regression_loss: 0.2462 - classification_loss: 0.0134 294/500 [================>.............] - ETA: 1:43 - loss: 0.2600 - regression_loss: 0.2466 - classification_loss: 0.0134 295/500 [================>.............] - ETA: 1:43 - loss: 0.2602 - regression_loss: 0.2468 - classification_loss: 0.0134 296/500 [================>.............] - ETA: 1:42 - loss: 0.2608 - regression_loss: 0.2472 - classification_loss: 0.0136 297/500 [================>.............] - ETA: 1:42 - loss: 0.2611 - regression_loss: 0.2474 - classification_loss: 0.0136 298/500 [================>.............] - ETA: 1:41 - loss: 0.2612 - regression_loss: 0.2476 - classification_loss: 0.0136 299/500 [================>.............] - ETA: 1:41 - loss: 0.2615 - regression_loss: 0.2478 - classification_loss: 0.0136 300/500 [=================>............] - ETA: 1:40 - loss: 0.2616 - regression_loss: 0.2479 - classification_loss: 0.0136 301/500 [=================>............] - ETA: 1:40 - loss: 0.2616 - regression_loss: 0.2480 - classification_loss: 0.0136 302/500 [=================>............] 
- ETA: 1:39 - loss: 0.2612 - regression_loss: 0.2476 - classification_loss: 0.0136 303/500 [=================>............] - ETA: 1:39 - loss: 0.2616 - regression_loss: 0.2480 - classification_loss: 0.0136 304/500 [=================>............] - ETA: 1:38 - loss: 0.2614 - regression_loss: 0.2478 - classification_loss: 0.0136 305/500 [=================>............] - ETA: 1:38 - loss: 0.2610 - regression_loss: 0.2474 - classification_loss: 0.0136 306/500 [=================>............] - ETA: 1:37 - loss: 0.2605 - regression_loss: 0.2469 - classification_loss: 0.0135 307/500 [=================>............] - ETA: 1:37 - loss: 0.2601 - regression_loss: 0.2466 - classification_loss: 0.0135 308/500 [=================>............] - ETA: 1:36 - loss: 0.2601 - regression_loss: 0.2465 - classification_loss: 0.0135 309/500 [=================>............] - ETA: 1:36 - loss: 0.2599 - regression_loss: 0.2464 - classification_loss: 0.0135 310/500 [=================>............] - ETA: 1:35 - loss: 0.2597 - regression_loss: 0.2462 - classification_loss: 0.0135 311/500 [=================>............] - ETA: 1:35 - loss: 0.2592 - regression_loss: 0.2458 - classification_loss: 0.0134 312/500 [=================>............] - ETA: 1:34 - loss: 0.2593 - regression_loss: 0.2458 - classification_loss: 0.0135 313/500 [=================>............] - ETA: 1:34 - loss: 0.2586 - regression_loss: 0.2451 - classification_loss: 0.0135 314/500 [=================>............] - ETA: 1:33 - loss: 0.2582 - regression_loss: 0.2447 - classification_loss: 0.0134 315/500 [=================>............] - ETA: 1:33 - loss: 0.2583 - regression_loss: 0.2448 - classification_loss: 0.0134 316/500 [=================>............] - ETA: 1:32 - loss: 0.2583 - regression_loss: 0.2448 - classification_loss: 0.0134 317/500 [==================>...........] - ETA: 1:32 - loss: 0.2589 - regression_loss: 0.2454 - classification_loss: 0.0135 318/500 [==================>...........] 
- ETA: 1:31 - loss: 0.2592 - regression_loss: 0.2457 - classification_loss: 0.0136 319/500 [==================>...........] - ETA: 1:31 - loss: 0.2595 - regression_loss: 0.2459 - classification_loss: 0.0136 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2592 - regression_loss: 0.2456 - classification_loss: 0.0136 321/500 [==================>...........] - ETA: 1:30 - loss: 0.2590 - regression_loss: 0.2455 - classification_loss: 0.0135 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2595 - regression_loss: 0.2459 - classification_loss: 0.0136 323/500 [==================>...........] - ETA: 1:29 - loss: 0.2595 - regression_loss: 0.2459 - classification_loss: 0.0136 324/500 [==================>...........] - ETA: 1:28 - loss: 0.2592 - regression_loss: 0.2457 - classification_loss: 0.0135 325/500 [==================>...........] - ETA: 1:28 - loss: 0.2590 - regression_loss: 0.2455 - classification_loss: 0.0135 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2594 - regression_loss: 0.2459 - classification_loss: 0.0135 327/500 [==================>...........] - ETA: 1:27 - loss: 0.2590 - regression_loss: 0.2455 - classification_loss: 0.0135 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2589 - regression_loss: 0.2454 - classification_loss: 0.0135 329/500 [==================>...........] - ETA: 1:26 - loss: 0.2585 - regression_loss: 0.2451 - classification_loss: 0.0135 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2586 - regression_loss: 0.2451 - classification_loss: 0.0135 331/500 [==================>...........] - ETA: 1:25 - loss: 0.2589 - regression_loss: 0.2454 - classification_loss: 0.0135 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2591 - regression_loss: 0.2456 - classification_loss: 0.0135 333/500 [==================>...........] - ETA: 1:24 - loss: 0.2591 - regression_loss: 0.2455 - classification_loss: 0.0136 334/500 [===================>..........] 
- ETA: 1:23 - loss: 0.2596 - regression_loss: 0.2459 - classification_loss: 0.0137 335/500 [===================>..........] - ETA: 1:23 - loss: 0.2601 - regression_loss: 0.2463 - classification_loss: 0.0138 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2598 - regression_loss: 0.2461 - classification_loss: 0.0138 337/500 [===================>..........] - ETA: 1:22 - loss: 0.2600 - regression_loss: 0.2462 - classification_loss: 0.0138 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2601 - regression_loss: 0.2464 - classification_loss: 0.0138 339/500 [===================>..........] - ETA: 1:21 - loss: 0.2602 - regression_loss: 0.2465 - classification_loss: 0.0138 340/500 [===================>..........] - ETA: 1:20 - loss: 0.2610 - regression_loss: 0.2472 - classification_loss: 0.0138 341/500 [===================>..........] - ETA: 1:19 - loss: 0.2607 - regression_loss: 0.2469 - classification_loss: 0.0137 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2612 - regression_loss: 0.2474 - classification_loss: 0.0138 343/500 [===================>..........] - ETA: 1:18 - loss: 0.2616 - regression_loss: 0.2478 - classification_loss: 0.0138 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2620 - regression_loss: 0.2482 - classification_loss: 0.0138 345/500 [===================>..........] - ETA: 1:17 - loss: 0.2624 - regression_loss: 0.2486 - classification_loss: 0.0138 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2621 - regression_loss: 0.2484 - classification_loss: 0.0138 347/500 [===================>..........] - ETA: 1:16 - loss: 0.2620 - regression_loss: 0.2483 - classification_loss: 0.0137 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2622 - regression_loss: 0.2485 - classification_loss: 0.0137 349/500 [===================>..........] - ETA: 1:15 - loss: 0.2620 - regression_loss: 0.2483 - classification_loss: 0.0137 350/500 [====================>.........] 
- ETA: 1:15 - loss: 0.2621 - regression_loss: 0.2484 - classification_loss: 0.0137 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2620 - regression_loss: 0.2483 - classification_loss: 0.0137 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2617 - regression_loss: 0.2481 - classification_loss: 0.0136 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2616 - regression_loss: 0.2480 - classification_loss: 0.0136 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2615 - regression_loss: 0.2479 - classification_loss: 0.0136 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2616 - regression_loss: 0.2480 - classification_loss: 0.0136 356/500 [====================>.........] - ETA: 1:12 - loss: 0.2612 - regression_loss: 0.2475 - classification_loss: 0.0136 357/500 [====================>.........] - ETA: 1:11 - loss: 0.2606 - regression_loss: 0.2471 - classification_loss: 0.0136 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2603 - regression_loss: 0.2468 - classification_loss: 0.0136 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2612 - regression_loss: 0.2476 - classification_loss: 0.0136 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2616 - regression_loss: 0.2480 - classification_loss: 0.0136 361/500 [====================>.........] - ETA: 1:09 - loss: 0.2613 - regression_loss: 0.2478 - classification_loss: 0.0136 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2618 - regression_loss: 0.2483 - classification_loss: 0.0136 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2618 - regression_loss: 0.2482 - classification_loss: 0.0136 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2617 - regression_loss: 0.2482 - classification_loss: 0.0136 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2616 - regression_loss: 0.2480 - classification_loss: 0.0135 366/500 [====================>.........] 
- ETA: 1:07 - loss: 0.2618 - regression_loss: 0.2482 - classification_loss: 0.0135 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2616 - regression_loss: 0.2480 - classification_loss: 0.0135 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2612 - regression_loss: 0.2477 - classification_loss: 0.0135 369/500 [=====================>........] - ETA: 1:05 - loss: 0.2610 - regression_loss: 0.2475 - classification_loss: 0.0135 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2610 - regression_loss: 0.2475 - classification_loss: 0.0135 371/500 [=====================>........] - ETA: 1:04 - loss: 0.2610 - regression_loss: 0.2475 - classification_loss: 0.0135 372/500 [=====================>........] - ETA: 1:04 - loss: 0.2608 - regression_loss: 0.2473 - classification_loss: 0.0135 373/500 [=====================>........] - ETA: 1:03 - loss: 0.2608 - regression_loss: 0.2474 - classification_loss: 0.0135 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2614 - regression_loss: 0.2479 - classification_loss: 0.0135 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2614 - regression_loss: 0.2479 - classification_loss: 0.0135 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2611 - regression_loss: 0.2475 - classification_loss: 0.0135 377/500 [=====================>........] - ETA: 1:01 - loss: 0.2607 - regression_loss: 0.2472 - classification_loss: 0.0135 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2610 - regression_loss: 0.2474 - classification_loss: 0.0135 379/500 [=====================>........] - ETA: 1:00 - loss: 0.2610 - regression_loss: 0.2475 - classification_loss: 0.0135 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2611 - regression_loss: 0.2476 - classification_loss: 0.0135 381/500 [=====================>........] - ETA: 59s - loss: 0.2609 - regression_loss: 0.2474 - classification_loss: 0.0135  382/500 [=====================>........] 
- ETA: 59s - loss: 0.2606 - regression_loss: 0.2472 - classification_loss: 0.0135 383/500 [=====================>........] - ETA: 58s - loss: 0.2605 - regression_loss: 0.2471 - classification_loss: 0.0134 384/500 [======================>.......] - ETA: 58s - loss: 0.2602 - regression_loss: 0.2468 - classification_loss: 0.0134 385/500 [======================>.......] - ETA: 57s - loss: 0.2607 - regression_loss: 0.2473 - classification_loss: 0.0134 386/500 [======================>.......] - ETA: 57s - loss: 0.2607 - regression_loss: 0.2474 - classification_loss: 0.0134 387/500 [======================>.......] - ETA: 56s - loss: 0.2606 - regression_loss: 0.2472 - classification_loss: 0.0134 388/500 [======================>.......] - ETA: 56s - loss: 0.2604 - regression_loss: 0.2470 - classification_loss: 0.0134 389/500 [======================>.......] - ETA: 55s - loss: 0.2605 - regression_loss: 0.2471 - classification_loss: 0.0134 390/500 [======================>.......] - ETA: 55s - loss: 0.2612 - regression_loss: 0.2478 - classification_loss: 0.0134 391/500 [======================>.......] - ETA: 54s - loss: 0.2609 - regression_loss: 0.2475 - classification_loss: 0.0134 392/500 [======================>.......] - ETA: 54s - loss: 0.2613 - regression_loss: 0.2479 - classification_loss: 0.0134 393/500 [======================>.......] - ETA: 53s - loss: 0.2618 - regression_loss: 0.2484 - classification_loss: 0.0134 394/500 [======================>.......] - ETA: 53s - loss: 0.2616 - regression_loss: 0.2482 - classification_loss: 0.0134 395/500 [======================>.......] - ETA: 52s - loss: 0.2617 - regression_loss: 0.2483 - classification_loss: 0.0134 396/500 [======================>.......] - ETA: 52s - loss: 0.2622 - regression_loss: 0.2488 - classification_loss: 0.0134 397/500 [======================>.......] - ETA: 51s - loss: 0.2618 - regression_loss: 0.2484 - classification_loss: 0.0134 398/500 [======================>.......] 
- ETA: 51s - loss: 0.2619 - regression_loss: 0.2485 - classification_loss: 0.0134 399/500 [======================>.......] - ETA: 50s - loss: 0.2621 - regression_loss: 0.2487 - classification_loss: 0.0134 400/500 [=======================>......] - ETA: 50s - loss: 0.2618 - regression_loss: 0.2484 - classification_loss: 0.0134 401/500 [=======================>......] - ETA: 49s - loss: 0.2617 - regression_loss: 0.2483 - classification_loss: 0.0133 402/500 [=======================>......] - ETA: 49s - loss: 0.2617 - regression_loss: 0.2483 - classification_loss: 0.0134 403/500 [=======================>......] - ETA: 48s - loss: 0.2621 - regression_loss: 0.2487 - classification_loss: 0.0133 404/500 [=======================>......] - ETA: 48s - loss: 0.2623 - regression_loss: 0.2489 - classification_loss: 0.0134 405/500 [=======================>......] - ETA: 47s - loss: 0.2620 - regression_loss: 0.2486 - classification_loss: 0.0133 406/500 [=======================>......] - ETA: 47s - loss: 0.2619 - regression_loss: 0.2485 - classification_loss: 0.0133 407/500 [=======================>......] - ETA: 46s - loss: 0.2617 - regression_loss: 0.2484 - classification_loss: 0.0133 408/500 [=======================>......] - ETA: 46s - loss: 0.2620 - regression_loss: 0.2486 - classification_loss: 0.0133 409/500 [=======================>......] - ETA: 45s - loss: 0.2618 - regression_loss: 0.2485 - classification_loss: 0.0133 410/500 [=======================>......] - ETA: 45s - loss: 0.2618 - regression_loss: 0.2485 - classification_loss: 0.0133 411/500 [=======================>......] - ETA: 44s - loss: 0.2617 - regression_loss: 0.2484 - classification_loss: 0.0133 412/500 [=======================>......] - ETA: 44s - loss: 0.2613 - regression_loss: 0.2481 - classification_loss: 0.0132 413/500 [=======================>......] - ETA: 43s - loss: 0.2624 - regression_loss: 0.2491 - classification_loss: 0.0133 414/500 [=======================>......] 
- ETA: 43s - loss: 0.2622 - regression_loss: 0.2489 - classification_loss: 0.0133 415/500 [=======================>......] - ETA: 42s - loss: 0.2618 - regression_loss: 0.2486 - classification_loss: 0.0133 416/500 [=======================>......] - ETA: 42s - loss: 0.2616 - regression_loss: 0.2484 - classification_loss: 0.0132 417/500 [========================>.....] - ETA: 41s - loss: 0.2618 - regression_loss: 0.2486 - classification_loss: 0.0132 418/500 [========================>.....] - ETA: 41s - loss: 0.2620 - regression_loss: 0.2488 - classification_loss: 0.0132 419/500 [========================>.....] - ETA: 40s - loss: 0.2623 - regression_loss: 0.2491 - classification_loss: 0.0132 420/500 [========================>.....] - ETA: 40s - loss: 0.2620 - regression_loss: 0.2487 - classification_loss: 0.0132 421/500 [========================>.....] - ETA: 39s - loss: 0.2620 - regression_loss: 0.2488 - classification_loss: 0.0132 422/500 [========================>.....] - ETA: 39s - loss: 0.2622 - regression_loss: 0.2490 - classification_loss: 0.0132 423/500 [========================>.....] - ETA: 38s - loss: 0.2627 - regression_loss: 0.2494 - classification_loss: 0.0132 424/500 [========================>.....] - ETA: 38s - loss: 0.2628 - regression_loss: 0.2495 - classification_loss: 0.0133 425/500 [========================>.....] - ETA: 37s - loss: 0.2626 - regression_loss: 0.2494 - classification_loss: 0.0132 426/500 [========================>.....] - ETA: 37s - loss: 0.2626 - regression_loss: 0.2493 - classification_loss: 0.0132 427/500 [========================>.....] - ETA: 36s - loss: 0.2624 - regression_loss: 0.2491 - classification_loss: 0.0132 428/500 [========================>.....] - ETA: 36s - loss: 0.2624 - regression_loss: 0.2491 - classification_loss: 0.0132 429/500 [========================>.....] - ETA: 35s - loss: 0.2623 - regression_loss: 0.2491 - classification_loss: 0.0132 430/500 [========================>.....] 
- ETA: 35s - loss: 0.2627 - regression_loss: 0.2494 - classification_loss: 0.0133
[per-batch progress updates for batches 431-499 of epoch 25 omitted]
500/500 [==============================] - 251s 503ms/step - loss: 0.2595 - regression_loss: 0.2462 - classification_loss: 0.0133
1172 instances of class plum with average precision: 0.7576
mAP: 0.7576
Epoch 00025: saving model to ./training/snapshots/resnet101_pascal_25.h5
Epoch 26/150
[per-batch progress updates for batches 1-264 of epoch 26 omitted]
265/500 [==============>...............]
- ETA: 1:58 - loss: 0.2613 - regression_loss: 0.2485 - classification_loss: 0.0128 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2614 - regression_loss: 0.2486 - classification_loss: 0.0128 267/500 [===============>..............] - ETA: 1:57 - loss: 0.2612 - regression_loss: 0.2484 - classification_loss: 0.0128 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2611 - regression_loss: 0.2484 - classification_loss: 0.0128 269/500 [===============>..............] - ETA: 1:56 - loss: 0.2608 - regression_loss: 0.2481 - classification_loss: 0.0128 270/500 [===============>..............] - ETA: 1:55 - loss: 0.2611 - regression_loss: 0.2484 - classification_loss: 0.0127 271/500 [===============>..............] - ETA: 1:55 - loss: 0.2611 - regression_loss: 0.2483 - classification_loss: 0.0127 272/500 [===============>..............] - ETA: 1:54 - loss: 0.2608 - regression_loss: 0.2481 - classification_loss: 0.0127 273/500 [===============>..............] - ETA: 1:54 - loss: 0.2603 - regression_loss: 0.2476 - classification_loss: 0.0127 274/500 [===============>..............] - ETA: 1:53 - loss: 0.2596 - regression_loss: 0.2469 - classification_loss: 0.0127 275/500 [===============>..............] - ETA: 1:53 - loss: 0.2595 - regression_loss: 0.2469 - classification_loss: 0.0126 276/500 [===============>..............] - ETA: 1:52 - loss: 0.2596 - regression_loss: 0.2470 - classification_loss: 0.0126 277/500 [===============>..............] - ETA: 1:52 - loss: 0.2604 - regression_loss: 0.2475 - classification_loss: 0.0129 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2603 - regression_loss: 0.2475 - classification_loss: 0.0129 279/500 [===============>..............] - ETA: 1:51 - loss: 0.2604 - regression_loss: 0.2475 - classification_loss: 0.0129 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2599 - regression_loss: 0.2470 - classification_loss: 0.0128 281/500 [===============>..............] 
- ETA: 1:50 - loss: 0.2594 - regression_loss: 0.2466 - classification_loss: 0.0128 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2596 - regression_loss: 0.2467 - classification_loss: 0.0129 283/500 [===============>..............] - ETA: 1:49 - loss: 0.2594 - regression_loss: 0.2465 - classification_loss: 0.0129 284/500 [================>.............] - ETA: 1:48 - loss: 0.2600 - regression_loss: 0.2472 - classification_loss: 0.0129 285/500 [================>.............] - ETA: 1:48 - loss: 0.2600 - regression_loss: 0.2471 - classification_loss: 0.0129 286/500 [================>.............] - ETA: 1:47 - loss: 0.2598 - regression_loss: 0.2469 - classification_loss: 0.0129 287/500 [================>.............] - ETA: 1:47 - loss: 0.2596 - regression_loss: 0.2467 - classification_loss: 0.0129 288/500 [================>.............] - ETA: 1:46 - loss: 0.2594 - regression_loss: 0.2465 - classification_loss: 0.0129 289/500 [================>.............] - ETA: 1:46 - loss: 0.2592 - regression_loss: 0.2462 - classification_loss: 0.0129 290/500 [================>.............] - ETA: 1:45 - loss: 0.2592 - regression_loss: 0.2462 - classification_loss: 0.0130 291/500 [================>.............] - ETA: 1:45 - loss: 0.2586 - regression_loss: 0.2456 - classification_loss: 0.0130 292/500 [================>.............] - ETA: 1:44 - loss: 0.2585 - regression_loss: 0.2456 - classification_loss: 0.0130 293/500 [================>.............] - ETA: 1:44 - loss: 0.2580 - regression_loss: 0.2451 - classification_loss: 0.0129 294/500 [================>.............] - ETA: 1:43 - loss: 0.2580 - regression_loss: 0.2451 - classification_loss: 0.0129 295/500 [================>.............] - ETA: 1:43 - loss: 0.2580 - regression_loss: 0.2451 - classification_loss: 0.0129 296/500 [================>.............] - ETA: 1:42 - loss: 0.2579 - regression_loss: 0.2450 - classification_loss: 0.0129 297/500 [================>.............] 
- ETA: 1:42 - loss: 0.2577 - regression_loss: 0.2448 - classification_loss: 0.0129 298/500 [================>.............] - ETA: 1:41 - loss: 0.2579 - regression_loss: 0.2450 - classification_loss: 0.0129 299/500 [================>.............] - ETA: 1:41 - loss: 0.2577 - regression_loss: 0.2448 - classification_loss: 0.0129 300/500 [=================>............] - ETA: 1:40 - loss: 0.2577 - regression_loss: 0.2447 - classification_loss: 0.0129 301/500 [=================>............] - ETA: 1:40 - loss: 0.2572 - regression_loss: 0.2443 - classification_loss: 0.0129 302/500 [=================>............] - ETA: 1:39 - loss: 0.2570 - regression_loss: 0.2441 - classification_loss: 0.0129 303/500 [=================>............] - ETA: 1:39 - loss: 0.2568 - regression_loss: 0.2439 - classification_loss: 0.0129 304/500 [=================>............] - ETA: 1:38 - loss: 0.2564 - regression_loss: 0.2435 - classification_loss: 0.0129 305/500 [=================>............] - ETA: 1:38 - loss: 0.2565 - regression_loss: 0.2436 - classification_loss: 0.0129 306/500 [=================>............] - ETA: 1:37 - loss: 0.2561 - regression_loss: 0.2433 - classification_loss: 0.0128 307/500 [=================>............] - ETA: 1:37 - loss: 0.2563 - regression_loss: 0.2435 - classification_loss: 0.0129 308/500 [=================>............] - ETA: 1:36 - loss: 0.2561 - regression_loss: 0.2432 - classification_loss: 0.0128 309/500 [=================>............] - ETA: 1:36 - loss: 0.2562 - regression_loss: 0.2434 - classification_loss: 0.0129 310/500 [=================>............] - ETA: 1:35 - loss: 0.2562 - regression_loss: 0.2434 - classification_loss: 0.0129 311/500 [=================>............] - ETA: 1:35 - loss: 0.2562 - regression_loss: 0.2433 - classification_loss: 0.0129 312/500 [=================>............] - ETA: 1:34 - loss: 0.2566 - regression_loss: 0.2437 - classification_loss: 0.0129 313/500 [=================>............] 
- ETA: 1:34 - loss: 0.2565 - regression_loss: 0.2436 - classification_loss: 0.0129 314/500 [=================>............] - ETA: 1:33 - loss: 0.2566 - regression_loss: 0.2437 - classification_loss: 0.0129 315/500 [=================>............] - ETA: 1:33 - loss: 0.2567 - regression_loss: 0.2438 - classification_loss: 0.0129 316/500 [=================>............] - ETA: 1:32 - loss: 0.2571 - regression_loss: 0.2442 - classification_loss: 0.0128 317/500 [==================>...........] - ETA: 1:32 - loss: 0.2571 - regression_loss: 0.2442 - classification_loss: 0.0128 318/500 [==================>...........] - ETA: 1:31 - loss: 0.2569 - regression_loss: 0.2441 - classification_loss: 0.0128 319/500 [==================>...........] - ETA: 1:31 - loss: 0.2567 - regression_loss: 0.2439 - classification_loss: 0.0128 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2566 - regression_loss: 0.2439 - classification_loss: 0.0128 321/500 [==================>...........] - ETA: 1:30 - loss: 0.2572 - regression_loss: 0.2443 - classification_loss: 0.0128 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2570 - regression_loss: 0.2442 - classification_loss: 0.0128 323/500 [==================>...........] - ETA: 1:29 - loss: 0.2575 - regression_loss: 0.2447 - classification_loss: 0.0128 324/500 [==================>...........] - ETA: 1:28 - loss: 0.2571 - regression_loss: 0.2443 - classification_loss: 0.0128 325/500 [==================>...........] - ETA: 1:28 - loss: 0.2571 - regression_loss: 0.2443 - classification_loss: 0.0128 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2564 - regression_loss: 0.2437 - classification_loss: 0.0128 327/500 [==================>...........] - ETA: 1:27 - loss: 0.2563 - regression_loss: 0.2435 - classification_loss: 0.0128 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2562 - regression_loss: 0.2435 - classification_loss: 0.0127 329/500 [==================>...........] 
- ETA: 1:26 - loss: 0.2560 - regression_loss: 0.2433 - classification_loss: 0.0127 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2566 - regression_loss: 0.2439 - classification_loss: 0.0128 331/500 [==================>...........] - ETA: 1:25 - loss: 0.2570 - regression_loss: 0.2442 - classification_loss: 0.0128 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2575 - regression_loss: 0.2448 - classification_loss: 0.0128 333/500 [==================>...........] - ETA: 1:24 - loss: 0.2582 - regression_loss: 0.2453 - classification_loss: 0.0129 334/500 [===================>..........] - ETA: 1:23 - loss: 0.2584 - regression_loss: 0.2455 - classification_loss: 0.0129 335/500 [===================>..........] - ETA: 1:23 - loss: 0.2585 - regression_loss: 0.2456 - classification_loss: 0.0129 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2586 - regression_loss: 0.2457 - classification_loss: 0.0129 337/500 [===================>..........] - ETA: 1:22 - loss: 0.2583 - regression_loss: 0.2455 - classification_loss: 0.0128 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2587 - regression_loss: 0.2459 - classification_loss: 0.0128 339/500 [===================>..........] - ETA: 1:21 - loss: 0.2590 - regression_loss: 0.2461 - classification_loss: 0.0129 340/500 [===================>..........] - ETA: 1:20 - loss: 0.2592 - regression_loss: 0.2463 - classification_loss: 0.0129 341/500 [===================>..........] - ETA: 1:20 - loss: 0.2589 - regression_loss: 0.2461 - classification_loss: 0.0129 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2599 - regression_loss: 0.2470 - classification_loss: 0.0129 343/500 [===================>..........] - ETA: 1:19 - loss: 0.2606 - regression_loss: 0.2477 - classification_loss: 0.0129 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2603 - regression_loss: 0.2475 - classification_loss: 0.0129 345/500 [===================>..........] 
- ETA: 1:18 - loss: 0.2599 - regression_loss: 0.2471 - classification_loss: 0.0128 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2604 - regression_loss: 0.2475 - classification_loss: 0.0129 347/500 [===================>..........] - ETA: 1:17 - loss: 0.2605 - regression_loss: 0.2476 - classification_loss: 0.0128 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2602 - regression_loss: 0.2473 - classification_loss: 0.0128 349/500 [===================>..........] - ETA: 1:16 - loss: 0.2603 - regression_loss: 0.2475 - classification_loss: 0.0128 350/500 [====================>.........] - ETA: 1:15 - loss: 0.2604 - regression_loss: 0.2476 - classification_loss: 0.0128 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2609 - regression_loss: 0.2480 - classification_loss: 0.0129 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2607 - regression_loss: 0.2479 - classification_loss: 0.0129 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2604 - regression_loss: 0.2476 - classification_loss: 0.0128 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2602 - regression_loss: 0.2474 - classification_loss: 0.0128 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2604 - regression_loss: 0.2476 - classification_loss: 0.0128 356/500 [====================>.........] - ETA: 1:12 - loss: 0.2606 - regression_loss: 0.2478 - classification_loss: 0.0128 357/500 [====================>.........] - ETA: 1:11 - loss: 0.2606 - regression_loss: 0.2478 - classification_loss: 0.0128 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2605 - regression_loss: 0.2477 - classification_loss: 0.0128 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2604 - regression_loss: 0.2475 - classification_loss: 0.0128 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2608 - regression_loss: 0.2480 - classification_loss: 0.0128 361/500 [====================>.........] 
- ETA: 1:09 - loss: 0.2608 - regression_loss: 0.2480 - classification_loss: 0.0128 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2609 - regression_loss: 0.2481 - classification_loss: 0.0128 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2608 - regression_loss: 0.2480 - classification_loss: 0.0128 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2608 - regression_loss: 0.2481 - classification_loss: 0.0128 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2612 - regression_loss: 0.2484 - classification_loss: 0.0127 366/500 [====================>.........] - ETA: 1:07 - loss: 0.2619 - regression_loss: 0.2491 - classification_loss: 0.0128 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2620 - regression_loss: 0.2493 - classification_loss: 0.0128 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2619 - regression_loss: 0.2491 - classification_loss: 0.0128 369/500 [=====================>........] - ETA: 1:05 - loss: 0.2621 - regression_loss: 0.2493 - classification_loss: 0.0128 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2622 - regression_loss: 0.2495 - classification_loss: 0.0128 371/500 [=====================>........] - ETA: 1:04 - loss: 0.2619 - regression_loss: 0.2492 - classification_loss: 0.0128 372/500 [=====================>........] - ETA: 1:04 - loss: 0.2618 - regression_loss: 0.2491 - classification_loss: 0.0127 373/500 [=====================>........] - ETA: 1:03 - loss: 0.2619 - regression_loss: 0.2491 - classification_loss: 0.0127 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2619 - regression_loss: 0.2491 - classification_loss: 0.0127 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2621 - regression_loss: 0.2493 - classification_loss: 0.0128 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2618 - regression_loss: 0.2491 - classification_loss: 0.0127 377/500 [=====================>........] 
- ETA: 1:01 - loss: 0.2619 - regression_loss: 0.2492 - classification_loss: 0.0127 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2617 - regression_loss: 0.2490 - classification_loss: 0.0127 379/500 [=====================>........] - ETA: 1:00 - loss: 0.2615 - regression_loss: 0.2488 - classification_loss: 0.0127 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2612 - regression_loss: 0.2485 - classification_loss: 0.0127 381/500 [=====================>........] - ETA: 59s - loss: 0.2614 - regression_loss: 0.2487 - classification_loss: 0.0127  382/500 [=====================>........] - ETA: 59s - loss: 0.2612 - regression_loss: 0.2485 - classification_loss: 0.0127 383/500 [=====================>........] - ETA: 58s - loss: 0.2611 - regression_loss: 0.2485 - classification_loss: 0.0126 384/500 [======================>.......] - ETA: 58s - loss: 0.2609 - regression_loss: 0.2482 - classification_loss: 0.0126 385/500 [======================>.......] - ETA: 57s - loss: 0.2611 - regression_loss: 0.2485 - classification_loss: 0.0126 386/500 [======================>.......] - ETA: 57s - loss: 0.2610 - regression_loss: 0.2484 - classification_loss: 0.0126 387/500 [======================>.......] - ETA: 56s - loss: 0.2604 - regression_loss: 0.2479 - classification_loss: 0.0126 388/500 [======================>.......] - ETA: 56s - loss: 0.2605 - regression_loss: 0.2479 - classification_loss: 0.0126 389/500 [======================>.......] - ETA: 55s - loss: 0.2603 - regression_loss: 0.2477 - classification_loss: 0.0126 390/500 [======================>.......] - ETA: 55s - loss: 0.2601 - regression_loss: 0.2476 - classification_loss: 0.0126 391/500 [======================>.......] - ETA: 54s - loss: 0.2601 - regression_loss: 0.2476 - classification_loss: 0.0126 392/500 [======================>.......] - ETA: 54s - loss: 0.2601 - regression_loss: 0.2476 - classification_loss: 0.0126 393/500 [======================>.......] 
- ETA: 53s - loss: 0.2606 - regression_loss: 0.2480 - classification_loss: 0.0126 394/500 [======================>.......] - ETA: 53s - loss: 0.2608 - regression_loss: 0.2481 - classification_loss: 0.0126 395/500 [======================>.......] - ETA: 52s - loss: 0.2610 - regression_loss: 0.2483 - classification_loss: 0.0127 396/500 [======================>.......] - ETA: 52s - loss: 0.2612 - regression_loss: 0.2484 - classification_loss: 0.0127 397/500 [======================>.......] - ETA: 51s - loss: 0.2612 - regression_loss: 0.2485 - classification_loss: 0.0127 398/500 [======================>.......] - ETA: 51s - loss: 0.2612 - regression_loss: 0.2484 - classification_loss: 0.0127 399/500 [======================>.......] - ETA: 50s - loss: 0.2611 - regression_loss: 0.2484 - classification_loss: 0.0127 400/500 [=======================>......] - ETA: 50s - loss: 0.2611 - regression_loss: 0.2483 - classification_loss: 0.0127 401/500 [=======================>......] - ETA: 49s - loss: 0.2612 - regression_loss: 0.2484 - classification_loss: 0.0127 402/500 [=======================>......] - ETA: 49s - loss: 0.2615 - regression_loss: 0.2487 - classification_loss: 0.0128 403/500 [=======================>......] - ETA: 48s - loss: 0.2615 - regression_loss: 0.2487 - classification_loss: 0.0128 404/500 [=======================>......] - ETA: 48s - loss: 0.2617 - regression_loss: 0.2490 - classification_loss: 0.0128 405/500 [=======================>......] - ETA: 47s - loss: 0.2620 - regression_loss: 0.2492 - classification_loss: 0.0128 406/500 [=======================>......] - ETA: 47s - loss: 0.2628 - regression_loss: 0.2500 - classification_loss: 0.0128 407/500 [=======================>......] - ETA: 46s - loss: 0.2629 - regression_loss: 0.2501 - classification_loss: 0.0128 408/500 [=======================>......] - ETA: 46s - loss: 0.2629 - regression_loss: 0.2501 - classification_loss: 0.0128 409/500 [=======================>......] 
- ETA: 45s - loss: 0.2625 - regression_loss: 0.2497 - classification_loss: 0.0128 410/500 [=======================>......] - ETA: 45s - loss: 0.2624 - regression_loss: 0.2497 - classification_loss: 0.0127 411/500 [=======================>......] - ETA: 44s - loss: 0.2620 - regression_loss: 0.2493 - classification_loss: 0.0127 412/500 [=======================>......] - ETA: 44s - loss: 0.2619 - regression_loss: 0.2492 - classification_loss: 0.0127 413/500 [=======================>......] - ETA: 43s - loss: 0.2619 - regression_loss: 0.2492 - classification_loss: 0.0127 414/500 [=======================>......] - ETA: 43s - loss: 0.2616 - regression_loss: 0.2489 - classification_loss: 0.0127 415/500 [=======================>......] - ETA: 42s - loss: 0.2614 - regression_loss: 0.2487 - classification_loss: 0.0127 416/500 [=======================>......] - ETA: 42s - loss: 0.2614 - regression_loss: 0.2488 - classification_loss: 0.0127 417/500 [========================>.....] - ETA: 41s - loss: 0.2612 - regression_loss: 0.2485 - classification_loss: 0.0127 418/500 [========================>.....] - ETA: 41s - loss: 0.2613 - regression_loss: 0.2486 - classification_loss: 0.0127 419/500 [========================>.....] - ETA: 40s - loss: 0.2611 - regression_loss: 0.2484 - classification_loss: 0.0127 420/500 [========================>.....] - ETA: 40s - loss: 0.2611 - regression_loss: 0.2484 - classification_loss: 0.0127 421/500 [========================>.....] - ETA: 39s - loss: 0.2610 - regression_loss: 0.2483 - classification_loss: 0.0127 422/500 [========================>.....] - ETA: 39s - loss: 0.2622 - regression_loss: 0.2495 - classification_loss: 0.0127 423/500 [========================>.....] - ETA: 38s - loss: 0.2621 - regression_loss: 0.2494 - classification_loss: 0.0127 424/500 [========================>.....] - ETA: 38s - loss: 0.2627 - regression_loss: 0.2500 - classification_loss: 0.0127 425/500 [========================>.....] 
- ETA: 37s - loss: 0.2630 - regression_loss: 0.2503 - classification_loss: 0.0127 426/500 [========================>.....] - ETA: 37s - loss: 0.2627 - regression_loss: 0.2500 - classification_loss: 0.0127 427/500 [========================>.....] - ETA: 36s - loss: 0.2625 - regression_loss: 0.2498 - classification_loss: 0.0127 428/500 [========================>.....] - ETA: 36s - loss: 0.2624 - regression_loss: 0.2497 - classification_loss: 0.0127 429/500 [========================>.....] - ETA: 35s - loss: 0.2625 - regression_loss: 0.2498 - classification_loss: 0.0127 430/500 [========================>.....] - ETA: 35s - loss: 0.2622 - regression_loss: 0.2495 - classification_loss: 0.0127 431/500 [========================>.....] - ETA: 34s - loss: 0.2624 - regression_loss: 0.2498 - classification_loss: 0.0127 432/500 [========================>.....] - ETA: 34s - loss: 0.2624 - regression_loss: 0.2498 - classification_loss: 0.0127 433/500 [========================>.....] - ETA: 33s - loss: 0.2623 - regression_loss: 0.2497 - classification_loss: 0.0126 434/500 [=========================>....] - ETA: 33s - loss: 0.2625 - regression_loss: 0.2499 - classification_loss: 0.0126 435/500 [=========================>....] - ETA: 32s - loss: 0.2626 - regression_loss: 0.2500 - classification_loss: 0.0126 436/500 [=========================>....] - ETA: 32s - loss: 0.2625 - regression_loss: 0.2499 - classification_loss: 0.0126 437/500 [=========================>....] - ETA: 31s - loss: 0.2624 - regression_loss: 0.2498 - classification_loss: 0.0126 438/500 [=========================>....] - ETA: 31s - loss: 0.2622 - regression_loss: 0.2496 - classification_loss: 0.0126 439/500 [=========================>....] - ETA: 30s - loss: 0.2619 - regression_loss: 0.2494 - classification_loss: 0.0126 440/500 [=========================>....] - ETA: 30s - loss: 0.2618 - regression_loss: 0.2492 - classification_loss: 0.0126 441/500 [=========================>....] 
- ETA: 29s - loss: 0.2623 - regression_loss: 0.2497 - classification_loss: 0.0126 442/500 [=========================>....] - ETA: 29s - loss: 0.2627 - regression_loss: 0.2500 - classification_loss: 0.0126 443/500 [=========================>....] - ETA: 28s - loss: 0.2627 - regression_loss: 0.2501 - classification_loss: 0.0127 444/500 [=========================>....] - ETA: 28s - loss: 0.2629 - regression_loss: 0.2502 - classification_loss: 0.0126 445/500 [=========================>....] - ETA: 27s - loss: 0.2628 - regression_loss: 0.2502 - classification_loss: 0.0126 446/500 [=========================>....] - ETA: 27s - loss: 0.2627 - regression_loss: 0.2501 - classification_loss: 0.0126 447/500 [=========================>....] - ETA: 26s - loss: 0.2627 - regression_loss: 0.2501 - classification_loss: 0.0127 448/500 [=========================>....] - ETA: 26s - loss: 0.2628 - regression_loss: 0.2502 - classification_loss: 0.0127 449/500 [=========================>....] - ETA: 25s - loss: 0.2627 - regression_loss: 0.2500 - classification_loss: 0.0127 450/500 [==========================>...] - ETA: 25s - loss: 0.2625 - regression_loss: 0.2498 - classification_loss: 0.0126 451/500 [==========================>...] - ETA: 24s - loss: 0.2626 - regression_loss: 0.2500 - classification_loss: 0.0127 452/500 [==========================>...] - ETA: 24s - loss: 0.2626 - regression_loss: 0.2500 - classification_loss: 0.0127 453/500 [==========================>...] - ETA: 23s - loss: 0.2627 - regression_loss: 0.2501 - classification_loss: 0.0127 454/500 [==========================>...] - ETA: 23s - loss: 0.2623 - regression_loss: 0.2497 - classification_loss: 0.0126 455/500 [==========================>...] - ETA: 22s - loss: 0.2622 - regression_loss: 0.2496 - classification_loss: 0.0126 456/500 [==========================>...] - ETA: 22s - loss: 0.2623 - regression_loss: 0.2497 - classification_loss: 0.0126 457/500 [==========================>...] 
- ETA: 21s - loss: 0.2618 - regression_loss: 0.2492 - classification_loss: 0.0126 458/500 [==========================>...] - ETA: 21s - loss: 0.2619 - regression_loss: 0.2493 - classification_loss: 0.0126 459/500 [==========================>...] - ETA: 20s - loss: 0.2621 - regression_loss: 0.2495 - classification_loss: 0.0126 460/500 [==========================>...] - ETA: 20s - loss: 0.2619 - regression_loss: 0.2493 - classification_loss: 0.0126 461/500 [==========================>...] - ETA: 19s - loss: 0.2615 - regression_loss: 0.2489 - classification_loss: 0.0126 462/500 [==========================>...] - ETA: 19s - loss: 0.2612 - regression_loss: 0.2487 - classification_loss: 0.0126 463/500 [==========================>...] - ETA: 18s - loss: 0.2609 - regression_loss: 0.2483 - classification_loss: 0.0125 464/500 [==========================>...] - ETA: 18s - loss: 0.2612 - regression_loss: 0.2486 - classification_loss: 0.0125 465/500 [==========================>...] - ETA: 17s - loss: 0.2609 - regression_loss: 0.2484 - classification_loss: 0.0125 466/500 [==========================>...] - ETA: 17s - loss: 0.2609 - regression_loss: 0.2483 - classification_loss: 0.0125 467/500 [===========================>..] - ETA: 16s - loss: 0.2612 - regression_loss: 0.2487 - classification_loss: 0.0126 468/500 [===========================>..] - ETA: 16s - loss: 0.2616 - regression_loss: 0.2490 - classification_loss: 0.0126 469/500 [===========================>..] - ETA: 15s - loss: 0.2622 - regression_loss: 0.2496 - classification_loss: 0.0126 470/500 [===========================>..] - ETA: 15s - loss: 0.2624 - regression_loss: 0.2498 - classification_loss: 0.0126 471/500 [===========================>..] - ETA: 14s - loss: 0.2625 - regression_loss: 0.2499 - classification_loss: 0.0126 472/500 [===========================>..] - ETA: 14s - loss: 0.2623 - regression_loss: 0.2498 - classification_loss: 0.0126 473/500 [===========================>..] 
[... per-batch progress-bar updates elided (batches 473-499 of 500, running loss ~0.261-0.262) ...]
500/500 [==============================] - 251s 503ms/step - loss: 0.2609 - regression_loss: 0.2484 - classification_loss: 0.0126
1172 instances of class plum with average precision: 0.7530
mAP: 0.7530
Epoch 00026: saving model to ./training/snapshots/resnet101_pascal_26.h5
Epoch 27/150
[... per-batch progress-bar updates elided (batches 1-308 of 500, running loss ~0.24, ETA counting down from 4:09) ...]
- ETA: 1:36 - loss: 0.2429 - regression_loss: 0.2311 - classification_loss: 0.0119 309/500 [=================>............] - ETA: 1:36 - loss: 0.2431 - regression_loss: 0.2312 - classification_loss: 0.0119 310/500 [=================>............] - ETA: 1:35 - loss: 0.2426 - regression_loss: 0.2307 - classification_loss: 0.0119 311/500 [=================>............] - ETA: 1:35 - loss: 0.2427 - regression_loss: 0.2308 - classification_loss: 0.0119 312/500 [=================>............] - ETA: 1:34 - loss: 0.2427 - regression_loss: 0.2308 - classification_loss: 0.0119 313/500 [=================>............] - ETA: 1:34 - loss: 0.2428 - regression_loss: 0.2309 - classification_loss: 0.0119 314/500 [=================>............] - ETA: 1:33 - loss: 0.2427 - regression_loss: 0.2308 - classification_loss: 0.0119 315/500 [=================>............] - ETA: 1:33 - loss: 0.2432 - regression_loss: 0.2313 - classification_loss: 0.0119 316/500 [=================>............] - ETA: 1:32 - loss: 0.2439 - regression_loss: 0.2319 - classification_loss: 0.0120 317/500 [==================>...........] - ETA: 1:32 - loss: 0.2435 - regression_loss: 0.2315 - classification_loss: 0.0119 318/500 [==================>...........] - ETA: 1:31 - loss: 0.2433 - regression_loss: 0.2314 - classification_loss: 0.0119 319/500 [==================>...........] - ETA: 1:31 - loss: 0.2439 - regression_loss: 0.2319 - classification_loss: 0.0120 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2439 - regression_loss: 0.2319 - classification_loss: 0.0120 321/500 [==================>...........] - ETA: 1:30 - loss: 0.2441 - regression_loss: 0.2321 - classification_loss: 0.0120 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2448 - regression_loss: 0.2328 - classification_loss: 0.0120 323/500 [==================>...........] - ETA: 1:29 - loss: 0.2451 - regression_loss: 0.2331 - classification_loss: 0.0120 324/500 [==================>...........] 
- ETA: 1:28 - loss: 0.2451 - regression_loss: 0.2331 - classification_loss: 0.0120 325/500 [==================>...........] - ETA: 1:28 - loss: 0.2449 - regression_loss: 0.2330 - classification_loss: 0.0120 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2455 - regression_loss: 0.2334 - classification_loss: 0.0120 327/500 [==================>...........] - ETA: 1:27 - loss: 0.2459 - regression_loss: 0.2339 - classification_loss: 0.0121 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2462 - regression_loss: 0.2341 - classification_loss: 0.0121 329/500 [==================>...........] - ETA: 1:26 - loss: 0.2459 - regression_loss: 0.2339 - classification_loss: 0.0121 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2455 - regression_loss: 0.2334 - classification_loss: 0.0121 331/500 [==================>...........] - ETA: 1:25 - loss: 0.2451 - regression_loss: 0.2331 - classification_loss: 0.0120 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2453 - regression_loss: 0.2332 - classification_loss: 0.0121 333/500 [==================>...........] - ETA: 1:23 - loss: 0.2455 - regression_loss: 0.2334 - classification_loss: 0.0121 334/500 [===================>..........] - ETA: 1:23 - loss: 0.2453 - regression_loss: 0.2332 - classification_loss: 0.0120 335/500 [===================>..........] - ETA: 1:22 - loss: 0.2451 - regression_loss: 0.2331 - classification_loss: 0.0120 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2453 - regression_loss: 0.2333 - classification_loss: 0.0120 337/500 [===================>..........] - ETA: 1:21 - loss: 0.2457 - regression_loss: 0.2337 - classification_loss: 0.0120 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2456 - regression_loss: 0.2336 - classification_loss: 0.0120 339/500 [===================>..........] - ETA: 1:20 - loss: 0.2466 - regression_loss: 0.2345 - classification_loss: 0.0121 340/500 [===================>..........] 
- ETA: 1:20 - loss: 0.2463 - regression_loss: 0.2342 - classification_loss: 0.0120 341/500 [===================>..........] - ETA: 1:19 - loss: 0.2458 - regression_loss: 0.2337 - classification_loss: 0.0120 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2460 - regression_loss: 0.2339 - classification_loss: 0.0121 343/500 [===================>..........] - ETA: 1:18 - loss: 0.2468 - regression_loss: 0.2346 - classification_loss: 0.0122 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2466 - regression_loss: 0.2344 - classification_loss: 0.0122 345/500 [===================>..........] - ETA: 1:17 - loss: 0.2464 - regression_loss: 0.2343 - classification_loss: 0.0122 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2466 - regression_loss: 0.2344 - classification_loss: 0.0122 347/500 [===================>..........] - ETA: 1:16 - loss: 0.2465 - regression_loss: 0.2343 - classification_loss: 0.0122 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2461 - regression_loss: 0.2339 - classification_loss: 0.0121 349/500 [===================>..........] - ETA: 1:15 - loss: 0.2462 - regression_loss: 0.2340 - classification_loss: 0.0121 350/500 [====================>.........] - ETA: 1:15 - loss: 0.2462 - regression_loss: 0.2341 - classification_loss: 0.0121 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2461 - regression_loss: 0.2340 - classification_loss: 0.0121 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2461 - regression_loss: 0.2339 - classification_loss: 0.0121 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2456 - regression_loss: 0.2335 - classification_loss: 0.0121 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2457 - regression_loss: 0.2336 - classification_loss: 0.0122 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2458 - regression_loss: 0.2337 - classification_loss: 0.0121 356/500 [====================>.........] 
- ETA: 1:12 - loss: 0.2453 - regression_loss: 0.2332 - classification_loss: 0.0121 357/500 [====================>.........] - ETA: 1:11 - loss: 0.2452 - regression_loss: 0.2331 - classification_loss: 0.0121 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2460 - regression_loss: 0.2337 - classification_loss: 0.0123 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2462 - regression_loss: 0.2339 - classification_loss: 0.0123 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2460 - regression_loss: 0.2338 - classification_loss: 0.0123 361/500 [====================>.........] - ETA: 1:09 - loss: 0.2461 - regression_loss: 0.2338 - classification_loss: 0.0123 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2461 - regression_loss: 0.2338 - classification_loss: 0.0123 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2458 - regression_loss: 0.2335 - classification_loss: 0.0123 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2456 - regression_loss: 0.2333 - classification_loss: 0.0123 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2456 - regression_loss: 0.2333 - classification_loss: 0.0123 366/500 [====================>.........] - ETA: 1:07 - loss: 0.2458 - regression_loss: 0.2335 - classification_loss: 0.0123 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2459 - regression_loss: 0.2336 - classification_loss: 0.0123 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2457 - regression_loss: 0.2335 - classification_loss: 0.0123 369/500 [=====================>........] - ETA: 1:05 - loss: 0.2461 - regression_loss: 0.2339 - classification_loss: 0.0123 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2463 - regression_loss: 0.2340 - classification_loss: 0.0123 371/500 [=====================>........] - ETA: 1:04 - loss: 0.2460 - regression_loss: 0.2338 - classification_loss: 0.0122 372/500 [=====================>........] 
- ETA: 1:04 - loss: 0.2460 - regression_loss: 0.2338 - classification_loss: 0.0122 373/500 [=====================>........] - ETA: 1:03 - loss: 0.2458 - regression_loss: 0.2336 - classification_loss: 0.0122 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2454 - regression_loss: 0.2332 - classification_loss: 0.0122 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2455 - regression_loss: 0.2333 - classification_loss: 0.0122 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2459 - regression_loss: 0.2335 - classification_loss: 0.0124 377/500 [=====================>........] - ETA: 1:01 - loss: 0.2461 - regression_loss: 0.2337 - classification_loss: 0.0124 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2460 - regression_loss: 0.2337 - classification_loss: 0.0123 379/500 [=====================>........] - ETA: 1:00 - loss: 0.2462 - regression_loss: 0.2338 - classification_loss: 0.0124 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2460 - regression_loss: 0.2337 - classification_loss: 0.0123 381/500 [=====================>........] - ETA: 59s - loss: 0.2461 - regression_loss: 0.2337 - classification_loss: 0.0123  382/500 [=====================>........] - ETA: 59s - loss: 0.2460 - regression_loss: 0.2337 - classification_loss: 0.0123 383/500 [=====================>........] - ETA: 58s - loss: 0.2459 - regression_loss: 0.2336 - classification_loss: 0.0123 384/500 [======================>.......] - ETA: 58s - loss: 0.2461 - regression_loss: 0.2337 - classification_loss: 0.0124 385/500 [======================>.......] - ETA: 57s - loss: 0.2466 - regression_loss: 0.2340 - classification_loss: 0.0125 386/500 [======================>.......] - ETA: 57s - loss: 0.2463 - regression_loss: 0.2338 - classification_loss: 0.0125 387/500 [======================>.......] - ETA: 56s - loss: 0.2459 - regression_loss: 0.2334 - classification_loss: 0.0125 388/500 [======================>.......] 
- ETA: 56s - loss: 0.2455 - regression_loss: 0.2331 - classification_loss: 0.0125 389/500 [======================>.......] - ETA: 55s - loss: 0.2456 - regression_loss: 0.2331 - classification_loss: 0.0125 390/500 [======================>.......] - ETA: 55s - loss: 0.2457 - regression_loss: 0.2332 - classification_loss: 0.0125 391/500 [======================>.......] - ETA: 54s - loss: 0.2454 - regression_loss: 0.2330 - classification_loss: 0.0124 392/500 [======================>.......] - ETA: 54s - loss: 0.2452 - regression_loss: 0.2327 - classification_loss: 0.0124 393/500 [======================>.......] - ETA: 53s - loss: 0.2451 - regression_loss: 0.2326 - classification_loss: 0.0124 394/500 [======================>.......] - ETA: 53s - loss: 0.2450 - regression_loss: 0.2326 - classification_loss: 0.0124 395/500 [======================>.......] - ETA: 52s - loss: 0.2452 - regression_loss: 0.2327 - classification_loss: 0.0125 396/500 [======================>.......] - ETA: 52s - loss: 0.2447 - regression_loss: 0.2323 - classification_loss: 0.0124 397/500 [======================>.......] - ETA: 51s - loss: 0.2446 - regression_loss: 0.2321 - classification_loss: 0.0124 398/500 [======================>.......] - ETA: 51s - loss: 0.2446 - regression_loss: 0.2322 - classification_loss: 0.0124 399/500 [======================>.......] - ETA: 50s - loss: 0.2447 - regression_loss: 0.2323 - classification_loss: 0.0124 400/500 [=======================>......] - ETA: 50s - loss: 0.2446 - regression_loss: 0.2322 - classification_loss: 0.0124 401/500 [=======================>......] - ETA: 49s - loss: 0.2448 - regression_loss: 0.2324 - classification_loss: 0.0124 402/500 [=======================>......] - ETA: 49s - loss: 0.2449 - regression_loss: 0.2324 - classification_loss: 0.0124 403/500 [=======================>......] - ETA: 48s - loss: 0.2452 - regression_loss: 0.2327 - classification_loss: 0.0125 404/500 [=======================>......] 
- ETA: 48s - loss: 0.2456 - regression_loss: 0.2331 - classification_loss: 0.0125 405/500 [=======================>......] - ETA: 47s - loss: 0.2460 - regression_loss: 0.2335 - classification_loss: 0.0125 406/500 [=======================>......] - ETA: 47s - loss: 0.2463 - regression_loss: 0.2337 - classification_loss: 0.0125 407/500 [=======================>......] - ETA: 46s - loss: 0.2459 - regression_loss: 0.2334 - classification_loss: 0.0125 408/500 [=======================>......] - ETA: 46s - loss: 0.2456 - regression_loss: 0.2331 - classification_loss: 0.0125 409/500 [=======================>......] - ETA: 45s - loss: 0.2458 - regression_loss: 0.2333 - classification_loss: 0.0125 410/500 [=======================>......] - ETA: 45s - loss: 0.2457 - regression_loss: 0.2332 - classification_loss: 0.0125 411/500 [=======================>......] - ETA: 44s - loss: 0.2458 - regression_loss: 0.2333 - classification_loss: 0.0125 412/500 [=======================>......] - ETA: 44s - loss: 0.2457 - regression_loss: 0.2333 - classification_loss: 0.0125 413/500 [=======================>......] - ETA: 43s - loss: 0.2459 - regression_loss: 0.2334 - classification_loss: 0.0125 414/500 [=======================>......] - ETA: 43s - loss: 0.2460 - regression_loss: 0.2336 - classification_loss: 0.0125 415/500 [=======================>......] - ETA: 42s - loss: 0.2461 - regression_loss: 0.2335 - classification_loss: 0.0125 416/500 [=======================>......] - ETA: 42s - loss: 0.2461 - regression_loss: 0.2336 - classification_loss: 0.0125 417/500 [========================>.....] - ETA: 41s - loss: 0.2459 - regression_loss: 0.2334 - classification_loss: 0.0125 418/500 [========================>.....] - ETA: 41s - loss: 0.2456 - regression_loss: 0.2331 - classification_loss: 0.0125 419/500 [========================>.....] - ETA: 40s - loss: 0.2457 - regression_loss: 0.2332 - classification_loss: 0.0125 420/500 [========================>.....] 
- ETA: 40s - loss: 0.2454 - regression_loss: 0.2330 - classification_loss: 0.0125 421/500 [========================>.....] - ETA: 39s - loss: 0.2454 - regression_loss: 0.2329 - classification_loss: 0.0125 422/500 [========================>.....] - ETA: 39s - loss: 0.2461 - regression_loss: 0.2336 - classification_loss: 0.0125 423/500 [========================>.....] - ETA: 38s - loss: 0.2466 - regression_loss: 0.2340 - classification_loss: 0.0125 424/500 [========================>.....] - ETA: 38s - loss: 0.2466 - regression_loss: 0.2341 - classification_loss: 0.0125 425/500 [========================>.....] - ETA: 37s - loss: 0.2466 - regression_loss: 0.2342 - classification_loss: 0.0125 426/500 [========================>.....] - ETA: 37s - loss: 0.2465 - regression_loss: 0.2340 - classification_loss: 0.0125 427/500 [========================>.....] - ETA: 36s - loss: 0.2464 - regression_loss: 0.2340 - classification_loss: 0.0125 428/500 [========================>.....] - ETA: 36s - loss: 0.2462 - regression_loss: 0.2337 - classification_loss: 0.0125 429/500 [========================>.....] - ETA: 35s - loss: 0.2466 - regression_loss: 0.2341 - classification_loss: 0.0125 430/500 [========================>.....] - ETA: 35s - loss: 0.2463 - regression_loss: 0.2338 - classification_loss: 0.0124 431/500 [========================>.....] - ETA: 34s - loss: 0.2461 - regression_loss: 0.2336 - classification_loss: 0.0124 432/500 [========================>.....] - ETA: 34s - loss: 0.2463 - regression_loss: 0.2339 - classification_loss: 0.0124 433/500 [========================>.....] - ETA: 33s - loss: 0.2460 - regression_loss: 0.2336 - classification_loss: 0.0124 434/500 [=========================>....] - ETA: 33s - loss: 0.2460 - regression_loss: 0.2335 - classification_loss: 0.0124 435/500 [=========================>....] - ETA: 32s - loss: 0.2460 - regression_loss: 0.2336 - classification_loss: 0.0124 436/500 [=========================>....] 
- ETA: 32s - loss: 0.2460 - regression_loss: 0.2336 - classification_loss: 0.0124 437/500 [=========================>....] - ETA: 31s - loss: 0.2459 - regression_loss: 0.2334 - classification_loss: 0.0124 438/500 [=========================>....] - ETA: 31s - loss: 0.2459 - regression_loss: 0.2335 - classification_loss: 0.0124 439/500 [=========================>....] - ETA: 30s - loss: 0.2457 - regression_loss: 0.2333 - classification_loss: 0.0124 440/500 [=========================>....] - ETA: 30s - loss: 0.2458 - regression_loss: 0.2334 - classification_loss: 0.0124 441/500 [=========================>....] - ETA: 29s - loss: 0.2457 - regression_loss: 0.2334 - classification_loss: 0.0124 442/500 [=========================>....] - ETA: 29s - loss: 0.2456 - regression_loss: 0.2333 - classification_loss: 0.0124 443/500 [=========================>....] - ETA: 28s - loss: 0.2457 - regression_loss: 0.2334 - classification_loss: 0.0124 444/500 [=========================>....] - ETA: 28s - loss: 0.2454 - regression_loss: 0.2330 - classification_loss: 0.0124 445/500 [=========================>....] - ETA: 27s - loss: 0.2455 - regression_loss: 0.2332 - classification_loss: 0.0124 446/500 [=========================>....] - ETA: 27s - loss: 0.2456 - regression_loss: 0.2333 - classification_loss: 0.0124 447/500 [=========================>....] - ETA: 26s - loss: 0.2458 - regression_loss: 0.2334 - classification_loss: 0.0124 448/500 [=========================>....] - ETA: 26s - loss: 0.2459 - regression_loss: 0.2335 - classification_loss: 0.0124 449/500 [=========================>....] - ETA: 25s - loss: 0.2463 - regression_loss: 0.2339 - classification_loss: 0.0124 450/500 [==========================>...] - ETA: 25s - loss: 0.2464 - regression_loss: 0.2340 - classification_loss: 0.0124 451/500 [==========================>...] - ETA: 24s - loss: 0.2467 - regression_loss: 0.2343 - classification_loss: 0.0124 452/500 [==========================>...] 
- ETA: 24s - loss: 0.2464 - regression_loss: 0.2340 - classification_loss: 0.0124 453/500 [==========================>...] - ETA: 23s - loss: 0.2463 - regression_loss: 0.2339 - classification_loss: 0.0124 454/500 [==========================>...] - ETA: 23s - loss: 0.2467 - regression_loss: 0.2342 - classification_loss: 0.0124 455/500 [==========================>...] - ETA: 22s - loss: 0.2466 - regression_loss: 0.2341 - classification_loss: 0.0124 456/500 [==========================>...] - ETA: 22s - loss: 0.2469 - regression_loss: 0.2344 - classification_loss: 0.0124 457/500 [==========================>...] - ETA: 21s - loss: 0.2468 - regression_loss: 0.2344 - classification_loss: 0.0124 458/500 [==========================>...] - ETA: 21s - loss: 0.2471 - regression_loss: 0.2347 - classification_loss: 0.0125 459/500 [==========================>...] - ETA: 20s - loss: 0.2471 - regression_loss: 0.2347 - classification_loss: 0.0124 460/500 [==========================>...] - ETA: 20s - loss: 0.2471 - regression_loss: 0.2347 - classification_loss: 0.0124 461/500 [==========================>...] - ETA: 19s - loss: 0.2471 - regression_loss: 0.2346 - classification_loss: 0.0124 462/500 [==========================>...] - ETA: 19s - loss: 0.2466 - regression_loss: 0.2342 - classification_loss: 0.0124 463/500 [==========================>...] - ETA: 18s - loss: 0.2468 - regression_loss: 0.2344 - classification_loss: 0.0124 464/500 [==========================>...] - ETA: 18s - loss: 0.2464 - regression_loss: 0.2340 - classification_loss: 0.0124 465/500 [==========================>...] - ETA: 17s - loss: 0.2464 - regression_loss: 0.2340 - classification_loss: 0.0124 466/500 [==========================>...] - ETA: 17s - loss: 0.2462 - regression_loss: 0.2338 - classification_loss: 0.0124 467/500 [===========================>..] - ETA: 16s - loss: 0.2462 - regression_loss: 0.2338 - classification_loss: 0.0124 468/500 [===========================>..] 
- ETA: 16s - loss: 0.2457 - regression_loss: 0.2334 - classification_loss: 0.0123 469/500 [===========================>..] - ETA: 15s - loss: 0.2461 - regression_loss: 0.2337 - classification_loss: 0.0124 470/500 [===========================>..] - ETA: 15s - loss: 0.2463 - regression_loss: 0.2340 - classification_loss: 0.0124 471/500 [===========================>..] - ETA: 14s - loss: 0.2464 - regression_loss: 0.2341 - classification_loss: 0.0124 472/500 [===========================>..] - ETA: 14s - loss: 0.2466 - regression_loss: 0.2343 - classification_loss: 0.0124 473/500 [===========================>..] - ETA: 13s - loss: 0.2469 - regression_loss: 0.2344 - classification_loss: 0.0124 474/500 [===========================>..] - ETA: 13s - loss: 0.2473 - regression_loss: 0.2349 - classification_loss: 0.0124 475/500 [===========================>..] - ETA: 12s - loss: 0.2474 - regression_loss: 0.2349 - classification_loss: 0.0124 476/500 [===========================>..] - ETA: 12s - loss: 0.2472 - regression_loss: 0.2347 - classification_loss: 0.0124 477/500 [===========================>..] - ETA: 11s - loss: 0.2469 - regression_loss: 0.2345 - classification_loss: 0.0124 478/500 [===========================>..] - ETA: 11s - loss: 0.2467 - regression_loss: 0.2343 - classification_loss: 0.0124 479/500 [===========================>..] - ETA: 10s - loss: 0.2464 - regression_loss: 0.2341 - classification_loss: 0.0124 480/500 [===========================>..] - ETA: 10s - loss: 0.2467 - regression_loss: 0.2344 - classification_loss: 0.0124 481/500 [===========================>..] - ETA: 9s - loss: 0.2465 - regression_loss: 0.2341 - classification_loss: 0.0124  482/500 [===========================>..] - ETA: 9s - loss: 0.2478 - regression_loss: 0.2354 - classification_loss: 0.0124 483/500 [===========================>..] - ETA: 8s - loss: 0.2484 - regression_loss: 0.2359 - classification_loss: 0.0124 484/500 [============================>.] 
- ETA: 8s - loss: 0.2484 - regression_loss: 0.2360 - classification_loss: 0.0124 485/500 [============================>.] - ETA: 7s - loss: 0.2488 - regression_loss: 0.2363 - classification_loss: 0.0125 486/500 [============================>.] - ETA: 7s - loss: 0.2489 - regression_loss: 0.2365 - classification_loss: 0.0125 487/500 [============================>.] - ETA: 6s - loss: 0.2488 - regression_loss: 0.2363 - classification_loss: 0.0125 488/500 [============================>.] - ETA: 6s - loss: 0.2485 - regression_loss: 0.2361 - classification_loss: 0.0124 489/500 [============================>.] - ETA: 5s - loss: 0.2483 - regression_loss: 0.2359 - classification_loss: 0.0124 490/500 [============================>.] - ETA: 5s - loss: 0.2482 - regression_loss: 0.2358 - classification_loss: 0.0124 491/500 [============================>.] - ETA: 4s - loss: 0.2479 - regression_loss: 0.2355 - classification_loss: 0.0124 492/500 [============================>.] - ETA: 4s - loss: 0.2478 - regression_loss: 0.2354 - classification_loss: 0.0124 493/500 [============================>.] - ETA: 3s - loss: 0.2479 - regression_loss: 0.2355 - classification_loss: 0.0124 494/500 [============================>.] - ETA: 3s - loss: 0.2484 - regression_loss: 0.2359 - classification_loss: 0.0125 495/500 [============================>.] - ETA: 2s - loss: 0.2483 - regression_loss: 0.2358 - classification_loss: 0.0125 496/500 [============================>.] - ETA: 2s - loss: 0.2491 - regression_loss: 0.2365 - classification_loss: 0.0125 497/500 [============================>.] - ETA: 1s - loss: 0.2487 - regression_loss: 0.2362 - classification_loss: 0.0125 498/500 [============================>.] - ETA: 1s - loss: 0.2488 - regression_loss: 0.2363 - classification_loss: 0.0125 499/500 [============================>.] 
500/500 [==============================] - 251s 503ms/step - loss: 0.2494 - regression_loss: 0.2368 - classification_loss: 0.0126
1172 instances of class plum with average precision: 0.7348
mAP: 0.7348
Epoch 00027: saving model to ./training/snapshots/resnet101_pascal_27.h5
Epoch 28/150
[per-step progress output for steps 1-78 of epoch 28 elided; loss decreased from ~0.95 at step 1 to ~0.29 by step 78]
- ETA: 3:30 - loss: 0.2887 - regression_loss: 0.2757 - classification_loss: 0.0130 79/500 [===>..........................] - ETA: 3:30 - loss: 0.2910 - regression_loss: 0.2779 - classification_loss: 0.0131 80/500 [===>..........................] - ETA: 3:30 - loss: 0.2918 - regression_loss: 0.2787 - classification_loss: 0.0131 81/500 [===>..........................] - ETA: 3:29 - loss: 0.2892 - regression_loss: 0.2762 - classification_loss: 0.0131 82/500 [===>..........................] - ETA: 3:29 - loss: 0.2887 - regression_loss: 0.2757 - classification_loss: 0.0130 83/500 [===>..........................] - ETA: 3:28 - loss: 0.2891 - regression_loss: 0.2762 - classification_loss: 0.0130 84/500 [====>.........................] - ETA: 3:28 - loss: 0.2890 - regression_loss: 0.2760 - classification_loss: 0.0129 85/500 [====>.........................] - ETA: 3:27 - loss: 0.2906 - regression_loss: 0.2775 - classification_loss: 0.0132 86/500 [====>.........................] - ETA: 3:27 - loss: 0.2897 - regression_loss: 0.2765 - classification_loss: 0.0132 87/500 [====>.........................] - ETA: 3:26 - loss: 0.2881 - regression_loss: 0.2748 - classification_loss: 0.0132 88/500 [====>.........................] - ETA: 3:26 - loss: 0.2876 - regression_loss: 0.2743 - classification_loss: 0.0132 89/500 [====>.........................] - ETA: 3:25 - loss: 0.2854 - regression_loss: 0.2723 - classification_loss: 0.0131 90/500 [====>.........................] - ETA: 3:25 - loss: 0.2866 - regression_loss: 0.2734 - classification_loss: 0.0132 91/500 [====>.........................] - ETA: 3:24 - loss: 0.2864 - regression_loss: 0.2732 - classification_loss: 0.0132 92/500 [====>.........................] - ETA: 3:24 - loss: 0.2883 - regression_loss: 0.2751 - classification_loss: 0.0131 93/500 [====>.........................] - ETA: 3:23 - loss: 0.2883 - regression_loss: 0.2752 - classification_loss: 0.0130 94/500 [====>.........................] 
- ETA: 3:23 - loss: 0.2887 - regression_loss: 0.2756 - classification_loss: 0.0131 95/500 [====>.........................] - ETA: 3:22 - loss: 0.2875 - regression_loss: 0.2744 - classification_loss: 0.0131 96/500 [====>.........................] - ETA: 3:22 - loss: 0.2878 - regression_loss: 0.2747 - classification_loss: 0.0131 97/500 [====>.........................] - ETA: 3:21 - loss: 0.2876 - regression_loss: 0.2745 - classification_loss: 0.0131 98/500 [====>.........................] - ETA: 3:21 - loss: 0.2873 - regression_loss: 0.2742 - classification_loss: 0.0131 99/500 [====>.........................] - ETA: 3:20 - loss: 0.2860 - regression_loss: 0.2731 - classification_loss: 0.0130 100/500 [=====>........................] - ETA: 3:20 - loss: 0.2852 - regression_loss: 0.2723 - classification_loss: 0.0129 101/500 [=====>........................] - ETA: 3:19 - loss: 0.2842 - regression_loss: 0.2713 - classification_loss: 0.0129 102/500 [=====>........................] - ETA: 3:19 - loss: 0.2843 - regression_loss: 0.2715 - classification_loss: 0.0128 103/500 [=====>........................] - ETA: 3:18 - loss: 0.2834 - regression_loss: 0.2706 - classification_loss: 0.0128 104/500 [=====>........................] - ETA: 3:18 - loss: 0.2823 - regression_loss: 0.2696 - classification_loss: 0.0127 105/500 [=====>........................] - ETA: 3:17 - loss: 0.2821 - regression_loss: 0.2694 - classification_loss: 0.0127 106/500 [=====>........................] - ETA: 3:17 - loss: 0.2826 - regression_loss: 0.2699 - classification_loss: 0.0127 107/500 [=====>........................] - ETA: 3:16 - loss: 0.2839 - regression_loss: 0.2712 - classification_loss: 0.0128 108/500 [=====>........................] - ETA: 3:16 - loss: 0.2828 - regression_loss: 0.2700 - classification_loss: 0.0128 109/500 [=====>........................] - ETA: 3:15 - loss: 0.2825 - regression_loss: 0.2698 - classification_loss: 0.0127 110/500 [=====>........................] 
- ETA: 3:15 - loss: 0.2820 - regression_loss: 0.2693 - classification_loss: 0.0127 111/500 [=====>........................] - ETA: 3:14 - loss: 0.2811 - regression_loss: 0.2684 - classification_loss: 0.0127 112/500 [=====>........................] - ETA: 3:14 - loss: 0.2805 - regression_loss: 0.2679 - classification_loss: 0.0127 113/500 [=====>........................] - ETA: 3:13 - loss: 0.2803 - regression_loss: 0.2677 - classification_loss: 0.0126 114/500 [=====>........................] - ETA: 3:13 - loss: 0.2789 - regression_loss: 0.2663 - classification_loss: 0.0125 115/500 [=====>........................] - ETA: 3:12 - loss: 0.2780 - regression_loss: 0.2655 - classification_loss: 0.0124 116/500 [=====>........................] - ETA: 3:12 - loss: 0.2791 - regression_loss: 0.2666 - classification_loss: 0.0125 117/500 [======>.......................] - ETA: 3:11 - loss: 0.2795 - regression_loss: 0.2671 - classification_loss: 0.0124 118/500 [======>.......................] - ETA: 3:11 - loss: 0.2806 - regression_loss: 0.2680 - classification_loss: 0.0125 119/500 [======>.......................] - ETA: 3:10 - loss: 0.2808 - regression_loss: 0.2683 - classification_loss: 0.0125 120/500 [======>.......................] - ETA: 3:10 - loss: 0.2805 - regression_loss: 0.2681 - classification_loss: 0.0125 121/500 [======>.......................] - ETA: 3:09 - loss: 0.2803 - regression_loss: 0.2678 - classification_loss: 0.0125 122/500 [======>.......................] - ETA: 3:09 - loss: 0.2798 - regression_loss: 0.2674 - classification_loss: 0.0125 123/500 [======>.......................] - ETA: 3:08 - loss: 0.2796 - regression_loss: 0.2671 - classification_loss: 0.0125 124/500 [======>.......................] - ETA: 3:08 - loss: 0.2804 - regression_loss: 0.2678 - classification_loss: 0.0125 125/500 [======>.......................] - ETA: 3:07 - loss: 0.2807 - regression_loss: 0.2682 - classification_loss: 0.0125 126/500 [======>.......................] 
- ETA: 3:07 - loss: 0.2825 - regression_loss: 0.2699 - classification_loss: 0.0126 127/500 [======>.......................] - ETA: 3:06 - loss: 0.2827 - regression_loss: 0.2701 - classification_loss: 0.0126 128/500 [======>.......................] - ETA: 3:06 - loss: 0.2849 - regression_loss: 0.2723 - classification_loss: 0.0126 129/500 [======>.......................] - ETA: 3:05 - loss: 0.2841 - regression_loss: 0.2715 - classification_loss: 0.0126 130/500 [======>.......................] - ETA: 3:05 - loss: 0.2827 - regression_loss: 0.2702 - classification_loss: 0.0125 131/500 [======>.......................] - ETA: 3:04 - loss: 0.2833 - regression_loss: 0.2703 - classification_loss: 0.0129 132/500 [======>.......................] - ETA: 3:04 - loss: 0.2823 - regression_loss: 0.2695 - classification_loss: 0.0129 133/500 [======>.......................] - ETA: 3:03 - loss: 0.2819 - regression_loss: 0.2691 - classification_loss: 0.0128 134/500 [=======>......................] - ETA: 3:03 - loss: 0.2820 - regression_loss: 0.2692 - classification_loss: 0.0128 135/500 [=======>......................] - ETA: 3:02 - loss: 0.2814 - regression_loss: 0.2686 - classification_loss: 0.0128 136/500 [=======>......................] - ETA: 3:02 - loss: 0.2803 - regression_loss: 0.2676 - classification_loss: 0.0127 137/500 [=======>......................] - ETA: 3:01 - loss: 0.2810 - regression_loss: 0.2682 - classification_loss: 0.0128 138/500 [=======>......................] - ETA: 3:01 - loss: 0.2805 - regression_loss: 0.2677 - classification_loss: 0.0127 139/500 [=======>......................] - ETA: 3:00 - loss: 0.2797 - regression_loss: 0.2670 - classification_loss: 0.0127 140/500 [=======>......................] - ETA: 3:00 - loss: 0.2799 - regression_loss: 0.2672 - classification_loss: 0.0127 141/500 [=======>......................] - ETA: 2:59 - loss: 0.2785 - regression_loss: 0.2659 - classification_loss: 0.0126 142/500 [=======>......................] 
- ETA: 2:58 - loss: 0.2775 - regression_loss: 0.2649 - classification_loss: 0.0126 143/500 [=======>......................] - ETA: 2:58 - loss: 0.2777 - regression_loss: 0.2650 - classification_loss: 0.0127 144/500 [=======>......................] - ETA: 2:57 - loss: 0.2774 - regression_loss: 0.2648 - classification_loss: 0.0127 145/500 [=======>......................] - ETA: 2:57 - loss: 0.2773 - regression_loss: 0.2646 - classification_loss: 0.0127 146/500 [=======>......................] - ETA: 2:56 - loss: 0.2771 - regression_loss: 0.2644 - classification_loss: 0.0127 147/500 [=======>......................] - ETA: 2:56 - loss: 0.2770 - regression_loss: 0.2643 - classification_loss: 0.0127 148/500 [=======>......................] - ETA: 2:55 - loss: 0.2767 - regression_loss: 0.2639 - classification_loss: 0.0128 149/500 [=======>......................] - ETA: 2:55 - loss: 0.2757 - regression_loss: 0.2630 - classification_loss: 0.0127 150/500 [========>.....................] - ETA: 2:54 - loss: 0.2745 - regression_loss: 0.2619 - classification_loss: 0.0126 151/500 [========>.....................] - ETA: 2:54 - loss: 0.2749 - regression_loss: 0.2621 - classification_loss: 0.0127 152/500 [========>.....................] - ETA: 2:53 - loss: 0.2738 - regression_loss: 0.2611 - classification_loss: 0.0127 153/500 [========>.....................] - ETA: 2:53 - loss: 0.2733 - regression_loss: 0.2607 - classification_loss: 0.0126 154/500 [========>.....................] - ETA: 2:52 - loss: 0.2739 - regression_loss: 0.2612 - classification_loss: 0.0126 155/500 [========>.....................] - ETA: 2:52 - loss: 0.2732 - regression_loss: 0.2606 - classification_loss: 0.0126 156/500 [========>.....................] - ETA: 2:51 - loss: 0.2738 - regression_loss: 0.2612 - classification_loss: 0.0126 157/500 [========>.....................] - ETA: 2:51 - loss: 0.2738 - regression_loss: 0.2613 - classification_loss: 0.0125 158/500 [========>.....................] 
- ETA: 2:50 - loss: 0.2740 - regression_loss: 0.2615 - classification_loss: 0.0125 159/500 [========>.....................] - ETA: 2:50 - loss: 0.2741 - regression_loss: 0.2616 - classification_loss: 0.0125 160/500 [========>.....................] - ETA: 2:49 - loss: 0.2738 - regression_loss: 0.2613 - classification_loss: 0.0125 161/500 [========>.....................] - ETA: 2:49 - loss: 0.2741 - regression_loss: 0.2616 - classification_loss: 0.0125 162/500 [========>.....................] - ETA: 2:48 - loss: 0.2741 - regression_loss: 0.2615 - classification_loss: 0.0125 163/500 [========>.....................] - ETA: 2:48 - loss: 0.2736 - regression_loss: 0.2611 - classification_loss: 0.0125 164/500 [========>.....................] - ETA: 2:47 - loss: 0.2732 - regression_loss: 0.2607 - classification_loss: 0.0125 165/500 [========>.....................] - ETA: 2:47 - loss: 0.2729 - regression_loss: 0.2604 - classification_loss: 0.0126 166/500 [========>.....................] - ETA: 2:46 - loss: 0.2727 - regression_loss: 0.2601 - classification_loss: 0.0126 167/500 [=========>....................] - ETA: 2:46 - loss: 0.2749 - regression_loss: 0.2622 - classification_loss: 0.0127 168/500 [=========>....................] - ETA: 2:45 - loss: 0.2754 - regression_loss: 0.2627 - classification_loss: 0.0127 169/500 [=========>....................] - ETA: 2:45 - loss: 0.2746 - regression_loss: 0.2620 - classification_loss: 0.0126 170/500 [=========>....................] - ETA: 2:44 - loss: 0.2747 - regression_loss: 0.2621 - classification_loss: 0.0126 171/500 [=========>....................] - ETA: 2:44 - loss: 0.2749 - regression_loss: 0.2623 - classification_loss: 0.0126 172/500 [=========>....................] - ETA: 2:43 - loss: 0.2744 - regression_loss: 0.2618 - classification_loss: 0.0126 173/500 [=========>....................] - ETA: 2:43 - loss: 0.2744 - regression_loss: 0.2618 - classification_loss: 0.0126 174/500 [=========>....................] 
- ETA: 2:42 - loss: 0.2740 - regression_loss: 0.2614 - classification_loss: 0.0126 175/500 [=========>....................] - ETA: 2:42 - loss: 0.2752 - regression_loss: 0.2625 - classification_loss: 0.0126 176/500 [=========>....................] - ETA: 2:41 - loss: 0.2753 - regression_loss: 0.2627 - classification_loss: 0.0126 177/500 [=========>....................] - ETA: 2:41 - loss: 0.2756 - regression_loss: 0.2631 - classification_loss: 0.0126 178/500 [=========>....................] - ETA: 2:40 - loss: 0.2751 - regression_loss: 0.2626 - classification_loss: 0.0125 179/500 [=========>....................] - ETA: 2:40 - loss: 0.2756 - regression_loss: 0.2630 - classification_loss: 0.0126 180/500 [=========>....................] - ETA: 2:39 - loss: 0.2748 - regression_loss: 0.2623 - classification_loss: 0.0125 181/500 [=========>....................] - ETA: 2:39 - loss: 0.2748 - regression_loss: 0.2623 - classification_loss: 0.0125 182/500 [=========>....................] - ETA: 2:39 - loss: 0.2743 - regression_loss: 0.2618 - classification_loss: 0.0125 183/500 [=========>....................] - ETA: 2:38 - loss: 0.2734 - regression_loss: 0.2609 - classification_loss: 0.0124 184/500 [==========>...................] - ETA: 2:38 - loss: 0.2740 - regression_loss: 0.2615 - classification_loss: 0.0125 185/500 [==========>...................] - ETA: 2:37 - loss: 0.2741 - regression_loss: 0.2616 - classification_loss: 0.0125 186/500 [==========>...................] - ETA: 2:37 - loss: 0.2735 - regression_loss: 0.2610 - classification_loss: 0.0125 187/500 [==========>...................] - ETA: 2:36 - loss: 0.2728 - regression_loss: 0.2604 - classification_loss: 0.0124 188/500 [==========>...................] - ETA: 2:36 - loss: 0.2721 - regression_loss: 0.2598 - classification_loss: 0.0124 189/500 [==========>...................] - ETA: 2:35 - loss: 0.2715 - regression_loss: 0.2592 - classification_loss: 0.0123 190/500 [==========>...................] 
- ETA: 2:35 - loss: 0.2709 - regression_loss: 0.2587 - classification_loss: 0.0123 191/500 [==========>...................] - ETA: 2:34 - loss: 0.2707 - regression_loss: 0.2584 - classification_loss: 0.0122 192/500 [==========>...................] - ETA: 2:34 - loss: 0.2707 - regression_loss: 0.2585 - classification_loss: 0.0122 193/500 [==========>...................] - ETA: 2:33 - loss: 0.2724 - regression_loss: 0.2600 - classification_loss: 0.0125 194/500 [==========>...................] - ETA: 2:33 - loss: 0.2741 - regression_loss: 0.2615 - classification_loss: 0.0126 195/500 [==========>...................] - ETA: 2:32 - loss: 0.2754 - regression_loss: 0.2627 - classification_loss: 0.0127 196/500 [==========>...................] - ETA: 2:32 - loss: 0.2762 - regression_loss: 0.2633 - classification_loss: 0.0129 197/500 [==========>...................] - ETA: 2:31 - loss: 0.2761 - regression_loss: 0.2632 - classification_loss: 0.0130 198/500 [==========>...................] - ETA: 2:31 - loss: 0.2755 - regression_loss: 0.2625 - classification_loss: 0.0129 199/500 [==========>...................] - ETA: 2:30 - loss: 0.2749 - regression_loss: 0.2620 - classification_loss: 0.0129 200/500 [===========>..................] - ETA: 2:30 - loss: 0.2753 - regression_loss: 0.2624 - classification_loss: 0.0129 201/500 [===========>..................] - ETA: 2:29 - loss: 0.2750 - regression_loss: 0.2621 - classification_loss: 0.0129 202/500 [===========>..................] - ETA: 2:29 - loss: 0.2759 - regression_loss: 0.2629 - classification_loss: 0.0130 203/500 [===========>..................] - ETA: 2:28 - loss: 0.2765 - regression_loss: 0.2635 - classification_loss: 0.0130 204/500 [===========>..................] - ETA: 2:28 - loss: 0.2765 - regression_loss: 0.2635 - classification_loss: 0.0129 205/500 [===========>..................] - ETA: 2:27 - loss: 0.2759 - regression_loss: 0.2630 - classification_loss: 0.0129 206/500 [===========>..................] 
- ETA: 2:27 - loss: 0.2753 - regression_loss: 0.2625 - classification_loss: 0.0129 207/500 [===========>..................] - ETA: 2:26 - loss: 0.2752 - regression_loss: 0.2623 - classification_loss: 0.0128 208/500 [===========>..................] - ETA: 2:26 - loss: 0.2752 - regression_loss: 0.2623 - classification_loss: 0.0129 209/500 [===========>..................] - ETA: 2:25 - loss: 0.2752 - regression_loss: 0.2624 - classification_loss: 0.0128 210/500 [===========>..................] - ETA: 2:25 - loss: 0.2750 - regression_loss: 0.2622 - classification_loss: 0.0128 211/500 [===========>..................] - ETA: 2:24 - loss: 0.2756 - regression_loss: 0.2628 - classification_loss: 0.0128 212/500 [===========>..................] - ETA: 2:24 - loss: 0.2757 - regression_loss: 0.2629 - classification_loss: 0.0128 213/500 [===========>..................] - ETA: 2:23 - loss: 0.2764 - regression_loss: 0.2635 - classification_loss: 0.0129 214/500 [===========>..................] - ETA: 2:23 - loss: 0.2761 - regression_loss: 0.2632 - classification_loss: 0.0129 215/500 [===========>..................] - ETA: 2:22 - loss: 0.2754 - regression_loss: 0.2626 - classification_loss: 0.0128 216/500 [===========>..................] - ETA: 2:22 - loss: 0.2765 - regression_loss: 0.2636 - classification_loss: 0.0129 217/500 [============>.................] - ETA: 2:21 - loss: 0.2769 - regression_loss: 0.2640 - classification_loss: 0.0129 218/500 [============>.................] - ETA: 2:21 - loss: 0.2766 - regression_loss: 0.2637 - classification_loss: 0.0129 219/500 [============>.................] - ETA: 2:20 - loss: 0.2769 - regression_loss: 0.2640 - classification_loss: 0.0130 220/500 [============>.................] - ETA: 2:20 - loss: 0.2767 - regression_loss: 0.2637 - classification_loss: 0.0130 221/500 [============>.................] - ETA: 2:19 - loss: 0.2767 - regression_loss: 0.2637 - classification_loss: 0.0130 222/500 [============>.................] 
- ETA: 2:19 - loss: 0.2766 - regression_loss: 0.2635 - classification_loss: 0.0130 223/500 [============>.................] - ETA: 2:18 - loss: 0.2756 - regression_loss: 0.2626 - classification_loss: 0.0130 224/500 [============>.................] - ETA: 2:18 - loss: 0.2751 - regression_loss: 0.2622 - classification_loss: 0.0129 225/500 [============>.................] - ETA: 2:17 - loss: 0.2747 - regression_loss: 0.2618 - classification_loss: 0.0129 226/500 [============>.................] - ETA: 2:17 - loss: 0.2748 - regression_loss: 0.2619 - classification_loss: 0.0129 227/500 [============>.................] - ETA: 2:16 - loss: 0.2745 - regression_loss: 0.2616 - classification_loss: 0.0129 228/500 [============>.................] - ETA: 2:16 - loss: 0.2749 - regression_loss: 0.2620 - classification_loss: 0.0129 229/500 [============>.................] - ETA: 2:15 - loss: 0.2748 - regression_loss: 0.2620 - classification_loss: 0.0128 230/500 [============>.................] - ETA: 2:15 - loss: 0.2740 - regression_loss: 0.2612 - classification_loss: 0.0128 231/500 [============>.................] - ETA: 2:14 - loss: 0.2738 - regression_loss: 0.2611 - classification_loss: 0.0128 232/500 [============>.................] - ETA: 2:14 - loss: 0.2743 - regression_loss: 0.2615 - classification_loss: 0.0128 233/500 [============>.................] - ETA: 2:13 - loss: 0.2740 - regression_loss: 0.2612 - classification_loss: 0.0128 234/500 [=============>................] - ETA: 2:13 - loss: 0.2736 - regression_loss: 0.2609 - classification_loss: 0.0127 235/500 [=============>................] - ETA: 2:12 - loss: 0.2748 - regression_loss: 0.2620 - classification_loss: 0.0127 236/500 [=============>................] - ETA: 2:12 - loss: 0.2750 - regression_loss: 0.2622 - classification_loss: 0.0128 237/500 [=============>................] - ETA: 2:11 - loss: 0.2751 - regression_loss: 0.2623 - classification_loss: 0.0128 238/500 [=============>................] 
- ETA: 2:11 - loss: 0.2748 - regression_loss: 0.2621 - classification_loss: 0.0127 239/500 [=============>................] - ETA: 2:10 - loss: 0.2741 - regression_loss: 0.2614 - classification_loss: 0.0127 240/500 [=============>................] - ETA: 2:10 - loss: 0.2743 - regression_loss: 0.2616 - classification_loss: 0.0127 241/500 [=============>................] - ETA: 2:09 - loss: 0.2748 - regression_loss: 0.2621 - classification_loss: 0.0127 242/500 [=============>................] - ETA: 2:09 - loss: 0.2760 - regression_loss: 0.2633 - classification_loss: 0.0126 243/500 [=============>................] - ETA: 2:08 - loss: 0.2752 - regression_loss: 0.2626 - classification_loss: 0.0126 244/500 [=============>................] - ETA: 2:08 - loss: 0.2753 - regression_loss: 0.2627 - classification_loss: 0.0126 245/500 [=============>................] - ETA: 2:07 - loss: 0.2752 - regression_loss: 0.2626 - classification_loss: 0.0126 246/500 [=============>................] - ETA: 2:07 - loss: 0.2760 - regression_loss: 0.2634 - classification_loss: 0.0126 247/500 [=============>................] - ETA: 2:06 - loss: 0.2764 - regression_loss: 0.2637 - classification_loss: 0.0126 248/500 [=============>................] - ETA: 2:06 - loss: 0.2772 - regression_loss: 0.2646 - classification_loss: 0.0126 249/500 [=============>................] - ETA: 2:05 - loss: 0.2786 - regression_loss: 0.2658 - classification_loss: 0.0128 250/500 [==============>...............] - ETA: 2:05 - loss: 0.2794 - regression_loss: 0.2665 - classification_loss: 0.0129 251/500 [==============>...............] - ETA: 2:04 - loss: 0.2789 - regression_loss: 0.2660 - classification_loss: 0.0129 252/500 [==============>...............] - ETA: 2:04 - loss: 0.2796 - regression_loss: 0.2667 - classification_loss: 0.0129 253/500 [==============>...............] - ETA: 2:03 - loss: 0.2799 - regression_loss: 0.2669 - classification_loss: 0.0129 254/500 [==============>...............] 
- ETA: 2:03 - loss: 0.2808 - regression_loss: 0.2678 - classification_loss: 0.0130 255/500 [==============>...............] - ETA: 2:02 - loss: 0.2812 - regression_loss: 0.2682 - classification_loss: 0.0130 256/500 [==============>...............] - ETA: 2:02 - loss: 0.2809 - regression_loss: 0.2680 - classification_loss: 0.0130 257/500 [==============>...............] - ETA: 2:01 - loss: 0.2803 - regression_loss: 0.2673 - classification_loss: 0.0129 258/500 [==============>...............] - ETA: 2:01 - loss: 0.2801 - regression_loss: 0.2672 - classification_loss: 0.0129 259/500 [==============>...............] - ETA: 2:00 - loss: 0.2799 - regression_loss: 0.2670 - classification_loss: 0.0129 260/500 [==============>...............] - ETA: 2:00 - loss: 0.2794 - regression_loss: 0.2665 - classification_loss: 0.0129 261/500 [==============>...............] - ETA: 1:59 - loss: 0.2791 - regression_loss: 0.2663 - classification_loss: 0.0129 262/500 [==============>...............] - ETA: 1:59 - loss: 0.2790 - regression_loss: 0.2661 - classification_loss: 0.0128 263/500 [==============>...............] - ETA: 1:58 - loss: 0.2795 - regression_loss: 0.2667 - classification_loss: 0.0128 264/500 [==============>...............] - ETA: 1:58 - loss: 0.2796 - regression_loss: 0.2668 - classification_loss: 0.0128 265/500 [==============>...............] - ETA: 1:57 - loss: 0.2795 - regression_loss: 0.2667 - classification_loss: 0.0128 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2796 - regression_loss: 0.2668 - classification_loss: 0.0128 267/500 [===============>..............] - ETA: 1:56 - loss: 0.2798 - regression_loss: 0.2670 - classification_loss: 0.0128 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2792 - regression_loss: 0.2664 - classification_loss: 0.0128 269/500 [===============>..............] - ETA: 1:55 - loss: 0.2793 - regression_loss: 0.2666 - classification_loss: 0.0128 270/500 [===============>..............] 
- ETA: 1:55 - loss: 0.2793 - regression_loss: 0.2665 - classification_loss: 0.0127 271/500 [===============>..............] - ETA: 1:54 - loss: 0.2791 - regression_loss: 0.2663 - classification_loss: 0.0127 272/500 [===============>..............] - ETA: 1:54 - loss: 0.2786 - regression_loss: 0.2659 - classification_loss: 0.0127 273/500 [===============>..............] - ETA: 1:53 - loss: 0.2785 - regression_loss: 0.2658 - classification_loss: 0.0127 274/500 [===============>..............] - ETA: 1:53 - loss: 0.2779 - regression_loss: 0.2653 - classification_loss: 0.0127 275/500 [===============>..............] - ETA: 1:52 - loss: 0.2780 - regression_loss: 0.2653 - classification_loss: 0.0127 276/500 [===============>..............] - ETA: 1:52 - loss: 0.2778 - regression_loss: 0.2652 - classification_loss: 0.0126 277/500 [===============>..............] - ETA: 1:51 - loss: 0.2774 - regression_loss: 0.2648 - classification_loss: 0.0126 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2776 - regression_loss: 0.2650 - classification_loss: 0.0126 279/500 [===============>..............] - ETA: 1:50 - loss: 0.2778 - regression_loss: 0.2652 - classification_loss: 0.0126 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2779 - regression_loss: 0.2653 - classification_loss: 0.0126 281/500 [===============>..............] - ETA: 1:49 - loss: 0.2781 - regression_loss: 0.2655 - classification_loss: 0.0126 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2782 - regression_loss: 0.2656 - classification_loss: 0.0126 283/500 [===============>..............] - ETA: 1:48 - loss: 0.2780 - regression_loss: 0.2655 - classification_loss: 0.0126 284/500 [================>.............] - ETA: 1:48 - loss: 0.2786 - regression_loss: 0.2660 - classification_loss: 0.0126 285/500 [================>.............] - ETA: 1:47 - loss: 0.2781 - regression_loss: 0.2655 - classification_loss: 0.0125 286/500 [================>.............] 
- ETA: 1:47 - loss: 0.2789 - regression_loss: 0.2663 - classification_loss: 0.0126
494/500 [============================>.]
- ETA: 3s - loss: 0.2691 - regression_loss: 0.2570 - classification_loss: 0.0121
500/500 [==============================] - 251s 502ms/step - loss: 0.2690 - regression_loss: 0.2568 - classification_loss: 0.0121
1172 instances of class plum with average precision: 0.7185
mAP: 0.7185
Epoch 00028: saving model to ./training/snapshots/resnet101_pascal_28.h5
Epoch 29/150
1/500 [..............................] - ETA: 4:09 - loss: 0.3097 - regression_loss: 0.3012 - classification_loss: 0.0085
9/500 [..............................]
- ETA: 4:03 - loss: 0.4622 - regression_loss: 0.4326 - classification_loss: 0.0295
105/500 [=====>........................]
- ETA: 3:17 - loss: 0.2499 - regression_loss: 0.2368 - classification_loss: 0.0130 106/500 [=====>........................] - ETA: 3:17 - loss: 0.2506 - regression_loss: 0.2376 - classification_loss: 0.0130 107/500 [=====>........................] - ETA: 3:16 - loss: 0.2501 - regression_loss: 0.2371 - classification_loss: 0.0130 108/500 [=====>........................] - ETA: 3:16 - loss: 0.2507 - regression_loss: 0.2377 - classification_loss: 0.0130 109/500 [=====>........................] - ETA: 3:15 - loss: 0.2499 - regression_loss: 0.2370 - classification_loss: 0.0129 110/500 [=====>........................] - ETA: 3:15 - loss: 0.2500 - regression_loss: 0.2371 - classification_loss: 0.0128 111/500 [=====>........................] - ETA: 3:14 - loss: 0.2522 - regression_loss: 0.2392 - classification_loss: 0.0130 112/500 [=====>........................] - ETA: 3:14 - loss: 0.2544 - regression_loss: 0.2413 - classification_loss: 0.0131 113/500 [=====>........................] - ETA: 3:13 - loss: 0.2561 - regression_loss: 0.2428 - classification_loss: 0.0133 114/500 [=====>........................] - ETA: 3:13 - loss: 0.2568 - regression_loss: 0.2435 - classification_loss: 0.0133 115/500 [=====>........................] - ETA: 3:12 - loss: 0.2573 - regression_loss: 0.2441 - classification_loss: 0.0132 116/500 [=====>........................] - ETA: 3:12 - loss: 0.2558 - regression_loss: 0.2426 - classification_loss: 0.0131 117/500 [======>.......................] - ETA: 3:11 - loss: 0.2563 - regression_loss: 0.2432 - classification_loss: 0.0131 118/500 [======>.......................] - ETA: 3:11 - loss: 0.2551 - regression_loss: 0.2420 - classification_loss: 0.0131 119/500 [======>.......................] - ETA: 3:10 - loss: 0.2546 - regression_loss: 0.2416 - classification_loss: 0.0130 120/500 [======>.......................] - ETA: 3:10 - loss: 0.2559 - regression_loss: 0.2429 - classification_loss: 0.0130 121/500 [======>.......................] 
- ETA: 3:09 - loss: 0.2566 - regression_loss: 0.2436 - classification_loss: 0.0130 122/500 [======>.......................] - ETA: 3:09 - loss: 0.2575 - regression_loss: 0.2445 - classification_loss: 0.0130 123/500 [======>.......................] - ETA: 3:08 - loss: 0.2564 - regression_loss: 0.2435 - classification_loss: 0.0129 124/500 [======>.......................] - ETA: 3:08 - loss: 0.2561 - regression_loss: 0.2432 - classification_loss: 0.0129 125/500 [======>.......................] - ETA: 3:07 - loss: 0.2556 - regression_loss: 0.2427 - classification_loss: 0.0128 126/500 [======>.......................] - ETA: 3:07 - loss: 0.2544 - regression_loss: 0.2416 - classification_loss: 0.0128 127/500 [======>.......................] - ETA: 3:06 - loss: 0.2534 - regression_loss: 0.2407 - classification_loss: 0.0127 128/500 [======>.......................] - ETA: 3:06 - loss: 0.2548 - regression_loss: 0.2419 - classification_loss: 0.0129 129/500 [======>.......................] - ETA: 3:06 - loss: 0.2553 - regression_loss: 0.2425 - classification_loss: 0.0128 130/500 [======>.......................] - ETA: 3:05 - loss: 0.2551 - regression_loss: 0.2422 - classification_loss: 0.0129 131/500 [======>.......................] - ETA: 3:04 - loss: 0.2551 - regression_loss: 0.2422 - classification_loss: 0.0128 132/500 [======>.......................] - ETA: 3:04 - loss: 0.2545 - regression_loss: 0.2418 - classification_loss: 0.0128 133/500 [======>.......................] - ETA: 3:03 - loss: 0.2548 - regression_loss: 0.2420 - classification_loss: 0.0128 134/500 [=======>......................] - ETA: 3:03 - loss: 0.2544 - regression_loss: 0.2416 - classification_loss: 0.0128 135/500 [=======>......................] - ETA: 3:02 - loss: 0.2544 - regression_loss: 0.2417 - classification_loss: 0.0127 136/500 [=======>......................] - ETA: 3:02 - loss: 0.2554 - regression_loss: 0.2427 - classification_loss: 0.0127 137/500 [=======>......................] 
- ETA: 3:01 - loss: 0.2563 - regression_loss: 0.2436 - classification_loss: 0.0127 138/500 [=======>......................] - ETA: 3:01 - loss: 0.2567 - regression_loss: 0.2440 - classification_loss: 0.0127 139/500 [=======>......................] - ETA: 3:00 - loss: 0.2580 - regression_loss: 0.2453 - classification_loss: 0.0127 140/500 [=======>......................] - ETA: 3:00 - loss: 0.2585 - regression_loss: 0.2457 - classification_loss: 0.0128 141/500 [=======>......................] - ETA: 3:00 - loss: 0.2590 - regression_loss: 0.2462 - classification_loss: 0.0128 142/500 [=======>......................] - ETA: 2:59 - loss: 0.2589 - regression_loss: 0.2461 - classification_loss: 0.0127 143/500 [=======>......................] - ETA: 2:58 - loss: 0.2582 - regression_loss: 0.2455 - classification_loss: 0.0127 144/500 [=======>......................] - ETA: 2:58 - loss: 0.2582 - regression_loss: 0.2454 - classification_loss: 0.0128 145/500 [=======>......................] - ETA: 2:57 - loss: 0.2596 - regression_loss: 0.2468 - classification_loss: 0.0129 146/500 [=======>......................] - ETA: 2:57 - loss: 0.2591 - regression_loss: 0.2463 - classification_loss: 0.0128 147/500 [=======>......................] - ETA: 2:56 - loss: 0.2580 - regression_loss: 0.2452 - classification_loss: 0.0128 148/500 [=======>......................] - ETA: 2:56 - loss: 0.2581 - regression_loss: 0.2454 - classification_loss: 0.0127 149/500 [=======>......................] - ETA: 2:55 - loss: 0.2601 - regression_loss: 0.2472 - classification_loss: 0.0128 150/500 [========>.....................] - ETA: 2:55 - loss: 0.2613 - regression_loss: 0.2483 - classification_loss: 0.0130 151/500 [========>.....................] - ETA: 2:54 - loss: 0.2613 - regression_loss: 0.2483 - classification_loss: 0.0129 152/500 [========>.....................] - ETA: 2:54 - loss: 0.2615 - regression_loss: 0.2486 - classification_loss: 0.0129 153/500 [========>.....................] 
- ETA: 2:53 - loss: 0.2612 - regression_loss: 0.2483 - classification_loss: 0.0129 154/500 [========>.....................] - ETA: 2:53 - loss: 0.2604 - regression_loss: 0.2476 - classification_loss: 0.0128 155/500 [========>.....................] - ETA: 2:52 - loss: 0.2597 - regression_loss: 0.2469 - classification_loss: 0.0128 156/500 [========>.....................] - ETA: 2:52 - loss: 0.2609 - regression_loss: 0.2480 - classification_loss: 0.0129 157/500 [========>.....................] - ETA: 2:52 - loss: 0.2618 - regression_loss: 0.2488 - classification_loss: 0.0130 158/500 [========>.....................] - ETA: 2:51 - loss: 0.2614 - regression_loss: 0.2484 - classification_loss: 0.0129 159/500 [========>.....................] - ETA: 2:50 - loss: 0.2607 - regression_loss: 0.2478 - classification_loss: 0.0129 160/500 [========>.....................] - ETA: 2:50 - loss: 0.2599 - regression_loss: 0.2471 - classification_loss: 0.0128 161/500 [========>.....................] - ETA: 2:50 - loss: 0.2598 - regression_loss: 0.2470 - classification_loss: 0.0128 162/500 [========>.....................] - ETA: 2:49 - loss: 0.2601 - regression_loss: 0.2474 - classification_loss: 0.0127 163/500 [========>.....................] - ETA: 2:48 - loss: 0.2593 - regression_loss: 0.2466 - classification_loss: 0.0127 164/500 [========>.....................] - ETA: 2:48 - loss: 0.2588 - regression_loss: 0.2462 - classification_loss: 0.0126 165/500 [========>.....................] - ETA: 2:47 - loss: 0.2589 - regression_loss: 0.2463 - classification_loss: 0.0126 166/500 [========>.....................] - ETA: 2:47 - loss: 0.2583 - regression_loss: 0.2458 - classification_loss: 0.0126 167/500 [=========>....................] - ETA: 2:46 - loss: 0.2592 - regression_loss: 0.2466 - classification_loss: 0.0127 168/500 [=========>....................] - ETA: 2:46 - loss: 0.2607 - regression_loss: 0.2480 - classification_loss: 0.0127 169/500 [=========>....................] 
- ETA: 2:45 - loss: 0.2604 - regression_loss: 0.2477 - classification_loss: 0.0127 170/500 [=========>....................] - ETA: 2:45 - loss: 0.2617 - regression_loss: 0.2489 - classification_loss: 0.0129 171/500 [=========>....................] - ETA: 2:44 - loss: 0.2618 - regression_loss: 0.2489 - classification_loss: 0.0129 172/500 [=========>....................] - ETA: 2:44 - loss: 0.2623 - regression_loss: 0.2493 - classification_loss: 0.0129 173/500 [=========>....................] - ETA: 2:43 - loss: 0.2621 - regression_loss: 0.2492 - classification_loss: 0.0129 174/500 [=========>....................] - ETA: 2:43 - loss: 0.2616 - regression_loss: 0.2488 - classification_loss: 0.0129 175/500 [=========>....................] - ETA: 2:42 - loss: 0.2616 - regression_loss: 0.2488 - classification_loss: 0.0129 176/500 [=========>....................] - ETA: 2:42 - loss: 0.2625 - regression_loss: 0.2496 - classification_loss: 0.0130 177/500 [=========>....................] - ETA: 2:41 - loss: 0.2628 - regression_loss: 0.2498 - classification_loss: 0.0130 178/500 [=========>....................] - ETA: 2:41 - loss: 0.2630 - regression_loss: 0.2500 - classification_loss: 0.0130 179/500 [=========>....................] - ETA: 2:40 - loss: 0.2621 - regression_loss: 0.2492 - classification_loss: 0.0130 180/500 [=========>....................] - ETA: 2:40 - loss: 0.2612 - regression_loss: 0.2482 - classification_loss: 0.0129 181/500 [=========>....................] - ETA: 2:39 - loss: 0.2612 - regression_loss: 0.2483 - classification_loss: 0.0129 182/500 [=========>....................] - ETA: 2:39 - loss: 0.2613 - regression_loss: 0.2484 - classification_loss: 0.0129 183/500 [=========>....................] - ETA: 2:38 - loss: 0.2609 - regression_loss: 0.2481 - classification_loss: 0.0128 184/500 [==========>...................] - ETA: 2:38 - loss: 0.2612 - regression_loss: 0.2484 - classification_loss: 0.0128 185/500 [==========>...................] 
- ETA: 2:37 - loss: 0.2612 - regression_loss: 0.2484 - classification_loss: 0.0128 186/500 [==========>...................] - ETA: 2:37 - loss: 0.2614 - regression_loss: 0.2486 - classification_loss: 0.0128 187/500 [==========>...................] - ETA: 2:36 - loss: 0.2613 - regression_loss: 0.2485 - classification_loss: 0.0128 188/500 [==========>...................] - ETA: 2:36 - loss: 0.2607 - regression_loss: 0.2479 - classification_loss: 0.0128 189/500 [==========>...................] - ETA: 2:35 - loss: 0.2611 - regression_loss: 0.2483 - classification_loss: 0.0128 190/500 [==========>...................] - ETA: 2:35 - loss: 0.2607 - regression_loss: 0.2479 - classification_loss: 0.0128 191/500 [==========>...................] - ETA: 2:34 - loss: 0.2608 - regression_loss: 0.2480 - classification_loss: 0.0128 192/500 [==========>...................] - ETA: 2:34 - loss: 0.2604 - regression_loss: 0.2476 - classification_loss: 0.0128 193/500 [==========>...................] - ETA: 2:33 - loss: 0.2604 - regression_loss: 0.2475 - classification_loss: 0.0129 194/500 [==========>...................] - ETA: 2:33 - loss: 0.2602 - regression_loss: 0.2473 - classification_loss: 0.0129 195/500 [==========>...................] - ETA: 2:32 - loss: 0.2601 - regression_loss: 0.2472 - classification_loss: 0.0129 196/500 [==========>...................] - ETA: 2:32 - loss: 0.2597 - regression_loss: 0.2469 - classification_loss: 0.0129 197/500 [==========>...................] - ETA: 2:31 - loss: 0.2591 - regression_loss: 0.2463 - classification_loss: 0.0128 198/500 [==========>...................] - ETA: 2:31 - loss: 0.2588 - regression_loss: 0.2460 - classification_loss: 0.0128 199/500 [==========>...................] - ETA: 2:30 - loss: 0.2581 - regression_loss: 0.2454 - classification_loss: 0.0127 200/500 [===========>..................] - ETA: 2:30 - loss: 0.2576 - regression_loss: 0.2449 - classification_loss: 0.0127 201/500 [===========>..................] 
- ETA: 2:29 - loss: 0.2564 - regression_loss: 0.2437 - classification_loss: 0.0126 202/500 [===========>..................] - ETA: 2:29 - loss: 0.2571 - regression_loss: 0.2444 - classification_loss: 0.0126 203/500 [===========>..................] - ETA: 2:28 - loss: 0.2570 - regression_loss: 0.2444 - classification_loss: 0.0126 204/500 [===========>..................] - ETA: 2:28 - loss: 0.2571 - regression_loss: 0.2445 - classification_loss: 0.0126 205/500 [===========>..................] - ETA: 2:27 - loss: 0.2565 - regression_loss: 0.2439 - classification_loss: 0.0126 206/500 [===========>..................] - ETA: 2:27 - loss: 0.2570 - regression_loss: 0.2444 - classification_loss: 0.0126 207/500 [===========>..................] - ETA: 2:26 - loss: 0.2564 - regression_loss: 0.2439 - classification_loss: 0.0125 208/500 [===========>..................] - ETA: 2:26 - loss: 0.2560 - regression_loss: 0.2435 - classification_loss: 0.0125 209/500 [===========>..................] - ETA: 2:25 - loss: 0.2566 - regression_loss: 0.2440 - classification_loss: 0.0126 210/500 [===========>..................] - ETA: 2:25 - loss: 0.2560 - regression_loss: 0.2434 - classification_loss: 0.0125 211/500 [===========>..................] - ETA: 2:24 - loss: 0.2558 - regression_loss: 0.2433 - classification_loss: 0.0126 212/500 [===========>..................] - ETA: 2:24 - loss: 0.2565 - regression_loss: 0.2439 - classification_loss: 0.0126 213/500 [===========>..................] - ETA: 2:23 - loss: 0.2563 - regression_loss: 0.2437 - classification_loss: 0.0126 214/500 [===========>..................] - ETA: 2:23 - loss: 0.2557 - regression_loss: 0.2432 - classification_loss: 0.0126 215/500 [===========>..................] - ETA: 2:22 - loss: 0.2555 - regression_loss: 0.2429 - classification_loss: 0.0125 216/500 [===========>..................] - ETA: 2:22 - loss: 0.2555 - regression_loss: 0.2430 - classification_loss: 0.0125 217/500 [============>.................] 
- ETA: 2:21 - loss: 0.2552 - regression_loss: 0.2427 - classification_loss: 0.0125 218/500 [============>.................] - ETA: 2:21 - loss: 0.2554 - regression_loss: 0.2429 - classification_loss: 0.0125 219/500 [============>.................] - ETA: 2:20 - loss: 0.2560 - regression_loss: 0.2434 - classification_loss: 0.0126 220/500 [============>.................] - ETA: 2:20 - loss: 0.2558 - regression_loss: 0.2432 - classification_loss: 0.0126 221/500 [============>.................] - ETA: 2:19 - loss: 0.2551 - regression_loss: 0.2426 - classification_loss: 0.0125 222/500 [============>.................] - ETA: 2:19 - loss: 0.2548 - regression_loss: 0.2423 - classification_loss: 0.0125 223/500 [============>.................] - ETA: 2:18 - loss: 0.2548 - regression_loss: 0.2423 - classification_loss: 0.0125 224/500 [============>.................] - ETA: 2:18 - loss: 0.2549 - regression_loss: 0.2424 - classification_loss: 0.0126 225/500 [============>.................] - ETA: 2:17 - loss: 0.2547 - regression_loss: 0.2421 - classification_loss: 0.0125 226/500 [============>.................] - ETA: 2:17 - loss: 0.2544 - regression_loss: 0.2419 - classification_loss: 0.0125 227/500 [============>.................] - ETA: 2:16 - loss: 0.2549 - regression_loss: 0.2422 - classification_loss: 0.0127 228/500 [============>.................] - ETA: 2:16 - loss: 0.2548 - regression_loss: 0.2421 - classification_loss: 0.0127 229/500 [============>.................] - ETA: 2:15 - loss: 0.2547 - regression_loss: 0.2420 - classification_loss: 0.0127 230/500 [============>.................] - ETA: 2:15 - loss: 0.2541 - regression_loss: 0.2414 - classification_loss: 0.0127 231/500 [============>.................] - ETA: 2:14 - loss: 0.2537 - regression_loss: 0.2411 - classification_loss: 0.0126 232/500 [============>.................] - ETA: 2:14 - loss: 0.2549 - regression_loss: 0.2423 - classification_loss: 0.0126 233/500 [============>.................] 
- ETA: 2:13 - loss: 0.2545 - regression_loss: 0.2420 - classification_loss: 0.0126 234/500 [=============>................] - ETA: 2:13 - loss: 0.2541 - regression_loss: 0.2415 - classification_loss: 0.0126 235/500 [=============>................] - ETA: 2:12 - loss: 0.2539 - regression_loss: 0.2413 - classification_loss: 0.0125 236/500 [=============>................] - ETA: 2:12 - loss: 0.2534 - regression_loss: 0.2408 - classification_loss: 0.0125 237/500 [=============>................] - ETA: 2:11 - loss: 0.2534 - regression_loss: 0.2409 - classification_loss: 0.0125 238/500 [=============>................] - ETA: 2:11 - loss: 0.2532 - regression_loss: 0.2407 - classification_loss: 0.0125 239/500 [=============>................] - ETA: 2:10 - loss: 0.2527 - regression_loss: 0.2402 - classification_loss: 0.0125 240/500 [=============>................] - ETA: 2:10 - loss: 0.2524 - regression_loss: 0.2399 - classification_loss: 0.0125 241/500 [=============>................] - ETA: 2:09 - loss: 0.2520 - regression_loss: 0.2396 - classification_loss: 0.0124 242/500 [=============>................] - ETA: 2:09 - loss: 0.2520 - regression_loss: 0.2395 - classification_loss: 0.0124 243/500 [=============>................] - ETA: 2:08 - loss: 0.2519 - regression_loss: 0.2394 - classification_loss: 0.0124 244/500 [=============>................] - ETA: 2:08 - loss: 0.2517 - regression_loss: 0.2392 - classification_loss: 0.0124 245/500 [=============>................] - ETA: 2:07 - loss: 0.2513 - regression_loss: 0.2389 - classification_loss: 0.0124 246/500 [=============>................] - ETA: 2:07 - loss: 0.2516 - regression_loss: 0.2391 - classification_loss: 0.0125 247/500 [=============>................] - ETA: 2:06 - loss: 0.2516 - regression_loss: 0.2392 - classification_loss: 0.0125 248/500 [=============>................] - ETA: 2:06 - loss: 0.2520 - regression_loss: 0.2394 - classification_loss: 0.0125 249/500 [=============>................] 
- ETA: 2:05 - loss: 0.2523 - regression_loss: 0.2398 - classification_loss: 0.0125 250/500 [==============>...............] - ETA: 2:05 - loss: 0.2518 - regression_loss: 0.2394 - classification_loss: 0.0125 251/500 [==============>...............] - ETA: 2:04 - loss: 0.2519 - regression_loss: 0.2395 - classification_loss: 0.0124 252/500 [==============>...............] - ETA: 2:04 - loss: 0.2519 - regression_loss: 0.2395 - classification_loss: 0.0124 253/500 [==============>...............] - ETA: 2:03 - loss: 0.2518 - regression_loss: 0.2394 - classification_loss: 0.0124 254/500 [==============>...............] - ETA: 2:03 - loss: 0.2517 - regression_loss: 0.2393 - classification_loss: 0.0124 255/500 [==============>...............] - ETA: 2:02 - loss: 0.2512 - regression_loss: 0.2389 - classification_loss: 0.0124 256/500 [==============>...............] - ETA: 2:02 - loss: 0.2510 - regression_loss: 0.2386 - classification_loss: 0.0123 257/500 [==============>...............] - ETA: 2:01 - loss: 0.2511 - regression_loss: 0.2387 - classification_loss: 0.0124 258/500 [==============>...............] - ETA: 2:01 - loss: 0.2508 - regression_loss: 0.2384 - classification_loss: 0.0124 259/500 [==============>...............] - ETA: 2:00 - loss: 0.2508 - regression_loss: 0.2384 - classification_loss: 0.0124 260/500 [==============>...............] - ETA: 2:00 - loss: 0.2503 - regression_loss: 0.2379 - classification_loss: 0.0123 261/500 [==============>...............] - ETA: 1:59 - loss: 0.2501 - regression_loss: 0.2378 - classification_loss: 0.0123 262/500 [==============>...............] - ETA: 1:59 - loss: 0.2503 - regression_loss: 0.2380 - classification_loss: 0.0123 263/500 [==============>...............] - ETA: 1:58 - loss: 0.2503 - regression_loss: 0.2379 - classification_loss: 0.0123 264/500 [==============>...............] - ETA: 1:58 - loss: 0.2498 - regression_loss: 0.2375 - classification_loss: 0.0123 265/500 [==============>...............] 
- ETA: 1:57 - loss: 0.2498 - regression_loss: 0.2375 - classification_loss: 0.0123 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2499 - regression_loss: 0.2376 - classification_loss: 0.0123 267/500 [===============>..............] - ETA: 1:56 - loss: 0.2500 - regression_loss: 0.2377 - classification_loss: 0.0123 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2506 - regression_loss: 0.2382 - classification_loss: 0.0124 269/500 [===============>..............] - ETA: 1:55 - loss: 0.2506 - regression_loss: 0.2382 - classification_loss: 0.0124 270/500 [===============>..............] - ETA: 1:55 - loss: 0.2501 - regression_loss: 0.2377 - classification_loss: 0.0124 271/500 [===============>..............] - ETA: 1:54 - loss: 0.2500 - regression_loss: 0.2376 - classification_loss: 0.0123 272/500 [===============>..............] - ETA: 1:54 - loss: 0.2495 - regression_loss: 0.2372 - classification_loss: 0.0123 273/500 [===============>..............] - ETA: 1:53 - loss: 0.2495 - regression_loss: 0.2372 - classification_loss: 0.0123 274/500 [===============>..............] - ETA: 1:53 - loss: 0.2495 - regression_loss: 0.2372 - classification_loss: 0.0123 275/500 [===============>..............] - ETA: 1:52 - loss: 0.2493 - regression_loss: 0.2370 - classification_loss: 0.0122 276/500 [===============>..............] - ETA: 1:52 - loss: 0.2494 - regression_loss: 0.2372 - classification_loss: 0.0122 277/500 [===============>..............] - ETA: 1:51 - loss: 0.2491 - regression_loss: 0.2369 - classification_loss: 0.0122 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2489 - regression_loss: 0.2367 - classification_loss: 0.0122 279/500 [===============>..............] - ETA: 1:50 - loss: 0.2489 - regression_loss: 0.2368 - classification_loss: 0.0122 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2488 - regression_loss: 0.2367 - classification_loss: 0.0121 281/500 [===============>..............] 
- ETA: 1:49 - loss: 0.2489 - regression_loss: 0.2368 - classification_loss: 0.0121 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2487 - regression_loss: 0.2366 - classification_loss: 0.0121 283/500 [===============>..............] - ETA: 1:48 - loss: 0.2494 - regression_loss: 0.2365 - classification_loss: 0.0128 284/500 [================>.............] - ETA: 1:48 - loss: 0.2489 - regression_loss: 0.2361 - classification_loss: 0.0128 285/500 [================>.............] - ETA: 1:47 - loss: 0.2489 - regression_loss: 0.2361 - classification_loss: 0.0128 286/500 [================>.............] - ETA: 1:47 - loss: 0.2488 - regression_loss: 0.2361 - classification_loss: 0.0127 287/500 [================>.............] - ETA: 1:46 - loss: 0.2487 - regression_loss: 0.2360 - classification_loss: 0.0127 288/500 [================>.............] - ETA: 1:46 - loss: 0.2482 - regression_loss: 0.2355 - classification_loss: 0.0127 289/500 [================>.............] - ETA: 1:45 - loss: 0.2478 - regression_loss: 0.2351 - classification_loss: 0.0127 290/500 [================>.............] - ETA: 1:45 - loss: 0.2481 - regression_loss: 0.2355 - classification_loss: 0.0126 291/500 [================>.............] - ETA: 1:44 - loss: 0.2479 - regression_loss: 0.2353 - classification_loss: 0.0126 292/500 [================>.............] - ETA: 1:44 - loss: 0.2484 - regression_loss: 0.2357 - classification_loss: 0.0126 293/500 [================>.............] - ETA: 1:43 - loss: 0.2485 - regression_loss: 0.2359 - classification_loss: 0.0127 294/500 [================>.............] - ETA: 1:43 - loss: 0.2484 - regression_loss: 0.2357 - classification_loss: 0.0127 295/500 [================>.............] - ETA: 1:42 - loss: 0.2482 - regression_loss: 0.2355 - classification_loss: 0.0126 296/500 [================>.............] - ETA: 1:42 - loss: 0.2486 - regression_loss: 0.2359 - classification_loss: 0.0127 297/500 [================>.............] 
- ETA: 1:41 - loss: 0.2484 - regression_loss: 0.2357 - classification_loss: 0.0127 298/500 [================>.............] - ETA: 1:41 - loss: 0.2486 - regression_loss: 0.2358 - classification_loss: 0.0127 299/500 [================>.............] - ETA: 1:40 - loss: 0.2485 - regression_loss: 0.2358 - classification_loss: 0.0127 300/500 [=================>............] - ETA: 1:40 - loss: 0.2482 - regression_loss: 0.2355 - classification_loss: 0.0127 301/500 [=================>............] - ETA: 1:39 - loss: 0.2480 - regression_loss: 0.2353 - classification_loss: 0.0127 302/500 [=================>............] - ETA: 1:39 - loss: 0.2482 - regression_loss: 0.2355 - classification_loss: 0.0127 303/500 [=================>............] - ETA: 1:38 - loss: 0.2479 - regression_loss: 0.2353 - classification_loss: 0.0126 304/500 [=================>............] - ETA: 1:38 - loss: 0.2478 - regression_loss: 0.2352 - classification_loss: 0.0126 305/500 [=================>............] - ETA: 1:37 - loss: 0.2481 - regression_loss: 0.2355 - classification_loss: 0.0126 306/500 [=================>............] - ETA: 1:37 - loss: 0.2477 - regression_loss: 0.2351 - classification_loss: 0.0126 307/500 [=================>............] - ETA: 1:36 - loss: 0.2474 - regression_loss: 0.2348 - classification_loss: 0.0126 308/500 [=================>............] - ETA: 1:36 - loss: 0.2473 - regression_loss: 0.2348 - classification_loss: 0.0126 309/500 [=================>............] - ETA: 1:35 - loss: 0.2471 - regression_loss: 0.2345 - classification_loss: 0.0126 310/500 [=================>............] - ETA: 1:35 - loss: 0.2471 - regression_loss: 0.2346 - classification_loss: 0.0125 311/500 [=================>............] - ETA: 1:34 - loss: 0.2469 - regression_loss: 0.2344 - classification_loss: 0.0125 312/500 [=================>............] - ETA: 1:34 - loss: 0.2465 - regression_loss: 0.2341 - classification_loss: 0.0125 313/500 [=================>............] 
- ETA: 1:33 - loss: 0.2465 - regression_loss: 0.2341 - classification_loss: 0.0125 314/500 [=================>............] - ETA: 1:33 - loss: 0.2463 - regression_loss: 0.2339 - classification_loss: 0.0125 315/500 [=================>............] - ETA: 1:32 - loss: 0.2458 - regression_loss: 0.2334 - classification_loss: 0.0124 316/500 [=================>............] - ETA: 1:32 - loss: 0.2456 - regression_loss: 0.2332 - classification_loss: 0.0124 317/500 [==================>...........] - ETA: 1:31 - loss: 0.2457 - regression_loss: 0.2332 - classification_loss: 0.0125 318/500 [==================>...........] - ETA: 1:31 - loss: 0.2461 - regression_loss: 0.2336 - classification_loss: 0.0125 319/500 [==================>...........] - ETA: 1:30 - loss: 0.2458 - regression_loss: 0.2333 - classification_loss: 0.0124 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2459 - regression_loss: 0.2335 - classification_loss: 0.0124 321/500 [==================>...........] - ETA: 1:29 - loss: 0.2462 - regression_loss: 0.2338 - classification_loss: 0.0125 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2461 - regression_loss: 0.2336 - classification_loss: 0.0124 323/500 [==================>...........] - ETA: 1:28 - loss: 0.2461 - regression_loss: 0.2337 - classification_loss: 0.0124 324/500 [==================>...........] - ETA: 1:28 - loss: 0.2465 - regression_loss: 0.2340 - classification_loss: 0.0124 325/500 [==================>...........] - ETA: 1:27 - loss: 0.2463 - regression_loss: 0.2339 - classification_loss: 0.0124 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2459 - regression_loss: 0.2336 - classification_loss: 0.0124 327/500 [==================>...........] - ETA: 1:26 - loss: 0.2460 - regression_loss: 0.2336 - classification_loss: 0.0124 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2460 - regression_loss: 0.2336 - classification_loss: 0.0124 329/500 [==================>...........] 
- ETA: 1:25 - loss: 0.2459 - regression_loss: 0.2336 - classification_loss: 0.0124 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2456 - regression_loss: 0.2333 - classification_loss: 0.0123 331/500 [==================>...........] - ETA: 1:24 - loss: 0.2459 - regression_loss: 0.2335 - classification_loss: 0.0124 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2457 - regression_loss: 0.2333 - classification_loss: 0.0124 333/500 [==================>...........] - ETA: 1:23 - loss: 0.2462 - regression_loss: 0.2338 - classification_loss: 0.0124 334/500 [===================>..........] - ETA: 1:23 - loss: 0.2461 - regression_loss: 0.2337 - classification_loss: 0.0124 335/500 [===================>..........] - ETA: 1:22 - loss: 0.2462 - regression_loss: 0.2338 - classification_loss: 0.0124 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2466 - regression_loss: 0.2342 - classification_loss: 0.0124 337/500 [===================>..........] - ETA: 1:21 - loss: 0.2468 - regression_loss: 0.2344 - classification_loss: 0.0124 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2466 - regression_loss: 0.2343 - classification_loss: 0.0124 339/500 [===================>..........] - ETA: 1:20 - loss: 0.2470 - regression_loss: 0.2346 - classification_loss: 0.0124 340/500 [===================>..........] - ETA: 1:20 - loss: 0.2470 - regression_loss: 0.2346 - classification_loss: 0.0124 341/500 [===================>..........] - ETA: 1:19 - loss: 0.2470 - regression_loss: 0.2346 - classification_loss: 0.0124 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2473 - regression_loss: 0.2350 - classification_loss: 0.0124 343/500 [===================>..........] - ETA: 1:18 - loss: 0.2475 - regression_loss: 0.2351 - classification_loss: 0.0124 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2470 - regression_loss: 0.2346 - classification_loss: 0.0123 345/500 [===================>..........] 
- ETA: 1:17 - loss: 0.2473 - regression_loss: 0.2350 - classification_loss: 0.0123 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2473 - regression_loss: 0.2350 - classification_loss: 0.0123 347/500 [===================>..........] - ETA: 1:16 - loss: 0.2467 - regression_loss: 0.2345 - classification_loss: 0.0123 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2469 - regression_loss: 0.2346 - classification_loss: 0.0123 349/500 [===================>..........] - ETA: 1:15 - loss: 0.2472 - regression_loss: 0.2348 - classification_loss: 0.0124 350/500 [====================>.........] - ETA: 1:15 - loss: 0.2477 - regression_loss: 0.2353 - classification_loss: 0.0124 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2476 - regression_loss: 0.2351 - classification_loss: 0.0124 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2475 - regression_loss: 0.2351 - classification_loss: 0.0124 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2476 - regression_loss: 0.2351 - classification_loss: 0.0124 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2474 - regression_loss: 0.2350 - classification_loss: 0.0124 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2473 - regression_loss: 0.2349 - classification_loss: 0.0124 356/500 [====================>.........] - ETA: 1:12 - loss: 0.2474 - regression_loss: 0.2350 - classification_loss: 0.0124 357/500 [====================>.........] - ETA: 1:11 - loss: 0.2474 - regression_loss: 0.2350 - classification_loss: 0.0124 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2473 - regression_loss: 0.2349 - classification_loss: 0.0124 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2469 - regression_loss: 0.2345 - classification_loss: 0.0124 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2471 - regression_loss: 0.2348 - classification_loss: 0.0123 361/500 [====================>.........] 
- ETA: 1:09 - loss: 0.2472 - regression_loss: 0.2348 - classification_loss: 0.0124 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2471 - regression_loss: 0.2347 - classification_loss: 0.0124 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2475 - regression_loss: 0.2351 - classification_loss: 0.0124 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2478 - regression_loss: 0.2354 - classification_loss: 0.0124 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2476 - regression_loss: 0.2353 - classification_loss: 0.0124 366/500 [====================>.........] - ETA: 1:07 - loss: 0.2473 - regression_loss: 0.2349 - classification_loss: 0.0124 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2472 - regression_loss: 0.2347 - classification_loss: 0.0124 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2475 - regression_loss: 0.2349 - classification_loss: 0.0126 369/500 [=====================>........] - ETA: 1:05 - loss: 0.2473 - regression_loss: 0.2347 - classification_loss: 0.0126 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2471 - regression_loss: 0.2345 - classification_loss: 0.0126 371/500 [=====================>........] - ETA: 1:04 - loss: 0.2468 - regression_loss: 0.2342 - classification_loss: 0.0125 372/500 [=====================>........] - ETA: 1:04 - loss: 0.2466 - regression_loss: 0.2340 - classification_loss: 0.0125 373/500 [=====================>........] - ETA: 1:03 - loss: 0.2463 - regression_loss: 0.2338 - classification_loss: 0.0125 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2465 - regression_loss: 0.2340 - classification_loss: 0.0125 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2467 - regression_loss: 0.2342 - classification_loss: 0.0125 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2467 - regression_loss: 0.2342 - classification_loss: 0.0125 377/500 [=====================>........] 
- ETA: 1:01 - loss: 0.2467 - regression_loss: 0.2343 - classification_loss: 0.0125 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2469 - regression_loss: 0.2344 - classification_loss: 0.0125 379/500 [=====================>........] - ETA: 1:00 - loss: 0.2474 - regression_loss: 0.2349 - classification_loss: 0.0125 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2477 - regression_loss: 0.2352 - classification_loss: 0.0125 381/500 [=====================>........] - ETA: 59s - loss: 0.2482 - regression_loss: 0.2357 - classification_loss: 0.0126  382/500 [=====================>........] - ETA: 59s - loss: 0.2480 - regression_loss: 0.2354 - classification_loss: 0.0125 383/500 [=====================>........] - ETA: 58s - loss: 0.2483 - regression_loss: 0.2358 - classification_loss: 0.0126 384/500 [======================>.......] - ETA: 58s - loss: 0.2482 - regression_loss: 0.2357 - classification_loss: 0.0126 385/500 [======================>.......] - ETA: 57s - loss: 0.2483 - regression_loss: 0.2358 - classification_loss: 0.0126 386/500 [======================>.......] - ETA: 57s - loss: 0.2482 - regression_loss: 0.2357 - classification_loss: 0.0125 387/500 [======================>.......] - ETA: 56s - loss: 0.2484 - regression_loss: 0.2359 - classification_loss: 0.0125 388/500 [======================>.......] - ETA: 56s - loss: 0.2483 - regression_loss: 0.2358 - classification_loss: 0.0125 389/500 [======================>.......] - ETA: 55s - loss: 0.2484 - regression_loss: 0.2359 - classification_loss: 0.0126 390/500 [======================>.......] - ETA: 55s - loss: 0.2486 - regression_loss: 0.2361 - classification_loss: 0.0125 391/500 [======================>.......] - ETA: 54s - loss: 0.2485 - regression_loss: 0.2359 - classification_loss: 0.0125 392/500 [======================>.......] - ETA: 54s - loss: 0.2483 - regression_loss: 0.2358 - classification_loss: 0.0125 393/500 [======================>.......] 
- ETA: 53s - loss: 0.2481 - regression_loss: 0.2356 - classification_loss: 0.0125 394/500 [======================>.......] - ETA: 53s - loss: 0.2481 - regression_loss: 0.2356 - classification_loss: 0.0125 395/500 [======================>.......] - ETA: 52s - loss: 0.2479 - regression_loss: 0.2355 - classification_loss: 0.0125 396/500 [======================>.......] - ETA: 52s - loss: 0.2479 - regression_loss: 0.2354 - classification_loss: 0.0125 397/500 [======================>.......] - ETA: 51s - loss: 0.2476 - regression_loss: 0.2351 - classification_loss: 0.0125 398/500 [======================>.......] - ETA: 51s - loss: 0.2475 - regression_loss: 0.2351 - classification_loss: 0.0124 399/500 [======================>.......] - ETA: 50s - loss: 0.2476 - regression_loss: 0.2352 - classification_loss: 0.0124 400/500 [=======================>......] - ETA: 50s - loss: 0.2473 - regression_loss: 0.2349 - classification_loss: 0.0124 401/500 [=======================>......] - ETA: 49s - loss: 0.2474 - regression_loss: 0.2349 - classification_loss: 0.0124 402/500 [=======================>......] - ETA: 49s - loss: 0.2470 - regression_loss: 0.2346 - classification_loss: 0.0124 403/500 [=======================>......] - ETA: 48s - loss: 0.2468 - regression_loss: 0.2345 - classification_loss: 0.0124 404/500 [=======================>......] - ETA: 48s - loss: 0.2472 - regression_loss: 0.2348 - classification_loss: 0.0124 405/500 [=======================>......] - ETA: 47s - loss: 0.2471 - regression_loss: 0.2347 - classification_loss: 0.0124 406/500 [=======================>......] - ETA: 47s - loss: 0.2470 - regression_loss: 0.2346 - classification_loss: 0.0124 407/500 [=======================>......] - ETA: 46s - loss: 0.2472 - regression_loss: 0.2348 - classification_loss: 0.0124 408/500 [=======================>......] - ETA: 46s - loss: 0.2468 - regression_loss: 0.2344 - classification_loss: 0.0123 409/500 [=======================>......] 
- ETA: 45s - loss: 0.2464 - regression_loss: 0.2341 - classification_loss: 0.0123 410/500 [=======================>......] - ETA: 45s - loss: 0.2463 - regression_loss: 0.2340 - classification_loss: 0.0123 411/500 [=======================>......] - ETA: 44s - loss: 0.2463 - regression_loss: 0.2340 - classification_loss: 0.0123 412/500 [=======================>......] - ETA: 44s - loss: 0.2462 - regression_loss: 0.2339 - classification_loss: 0.0123 413/500 [=======================>......] - ETA: 43s - loss: 0.2457 - regression_loss: 0.2334 - classification_loss: 0.0123 414/500 [=======================>......] - ETA: 43s - loss: 0.2457 - regression_loss: 0.2334 - classification_loss: 0.0123 415/500 [=======================>......] - ETA: 42s - loss: 0.2458 - regression_loss: 0.2335 - classification_loss: 0.0123 416/500 [=======================>......] - ETA: 42s - loss: 0.2461 - regression_loss: 0.2338 - classification_loss: 0.0123 417/500 [========================>.....] - ETA: 41s - loss: 0.2468 - regression_loss: 0.2344 - classification_loss: 0.0124 418/500 [========================>.....] - ETA: 41s - loss: 0.2469 - regression_loss: 0.2346 - classification_loss: 0.0123 419/500 [========================>.....] - ETA: 40s - loss: 0.2471 - regression_loss: 0.2348 - classification_loss: 0.0124 420/500 [========================>.....] - ETA: 40s - loss: 0.2472 - regression_loss: 0.2349 - classification_loss: 0.0124 421/500 [========================>.....] - ETA: 39s - loss: 0.2473 - regression_loss: 0.2350 - classification_loss: 0.0123 422/500 [========================>.....] - ETA: 39s - loss: 0.2473 - regression_loss: 0.2349 - classification_loss: 0.0123 423/500 [========================>.....] - ETA: 38s - loss: 0.2473 - regression_loss: 0.2350 - classification_loss: 0.0123 424/500 [========================>.....] - ETA: 38s - loss: 0.2470 - regression_loss: 0.2347 - classification_loss: 0.0123 425/500 [========================>.....] 
- ETA: 37s - loss: 0.2473 - regression_loss: 0.2350 - classification_loss: 0.0123 426/500 [========================>.....] - ETA: 37s - loss: 0.2470 - regression_loss: 0.2347 - classification_loss: 0.0123 427/500 [========================>.....] - ETA: 36s - loss: 0.2472 - regression_loss: 0.2348 - classification_loss: 0.0124 428/500 [========================>.....] - ETA: 36s - loss: 0.2473 - regression_loss: 0.2349 - classification_loss: 0.0124 429/500 [========================>.....] - ETA: 35s - loss: 0.2471 - regression_loss: 0.2347 - classification_loss: 0.0124 430/500 [========================>.....] - ETA: 35s - loss: 0.2470 - regression_loss: 0.2346 - classification_loss: 0.0124 431/500 [========================>.....] - ETA: 34s - loss: 0.2470 - regression_loss: 0.2347 - classification_loss: 0.0123 432/500 [========================>.....] - ETA: 34s - loss: 0.2469 - regression_loss: 0.2346 - classification_loss: 0.0124 433/500 [========================>.....] - ETA: 33s - loss: 0.2470 - regression_loss: 0.2346 - classification_loss: 0.0123 434/500 [=========================>....] - ETA: 33s - loss: 0.2470 - regression_loss: 0.2346 - classification_loss: 0.0123 435/500 [=========================>....] - ETA: 32s - loss: 0.2472 - regression_loss: 0.2348 - classification_loss: 0.0124 436/500 [=========================>....] - ETA: 32s - loss: 0.2471 - regression_loss: 0.2347 - classification_loss: 0.0123 437/500 [=========================>....] - ETA: 31s - loss: 0.2468 - regression_loss: 0.2345 - classification_loss: 0.0123 438/500 [=========================>....] - ETA: 31s - loss: 0.2467 - regression_loss: 0.2344 - classification_loss: 0.0123 439/500 [=========================>....] - ETA: 30s - loss: 0.2468 - regression_loss: 0.2345 - classification_loss: 0.0123 440/500 [=========================>....] - ETA: 30s - loss: 0.2467 - regression_loss: 0.2343 - classification_loss: 0.0123 441/500 [=========================>....] 
- ETA: 29s - loss: 0.2466 - regression_loss: 0.2343 - classification_loss: 0.0123 442/500 [=========================>....] - ETA: 29s - loss: 0.2466 - regression_loss: 0.2343 - classification_loss: 0.0123 443/500 [=========================>....] - ETA: 28s - loss: 0.2465 - regression_loss: 0.2342 - classification_loss: 0.0123 444/500 [=========================>....] - ETA: 28s - loss: 0.2462 - regression_loss: 0.2340 - classification_loss: 0.0123 445/500 [=========================>....] - ETA: 27s - loss: 0.2460 - regression_loss: 0.2337 - classification_loss: 0.0123 446/500 [=========================>....] - ETA: 27s - loss: 0.2462 - regression_loss: 0.2340 - classification_loss: 0.0123 447/500 [=========================>....] - ETA: 26s - loss: 0.2461 - regression_loss: 0.2339 - classification_loss: 0.0123 448/500 [=========================>....] - ETA: 26s - loss: 0.2464 - regression_loss: 0.2341 - classification_loss: 0.0123 449/500 [=========================>....] - ETA: 25s - loss: 0.2465 - regression_loss: 0.2342 - classification_loss: 0.0123 450/500 [==========================>...] - ETA: 25s - loss: 0.2462 - regression_loss: 0.2339 - classification_loss: 0.0123 451/500 [==========================>...] - ETA: 24s - loss: 0.2464 - regression_loss: 0.2340 - classification_loss: 0.0123 452/500 [==========================>...] - ETA: 24s - loss: 0.2463 - regression_loss: 0.2340 - classification_loss: 0.0123 453/500 [==========================>...] - ETA: 23s - loss: 0.2462 - regression_loss: 0.2339 - classification_loss: 0.0123 454/500 [==========================>...] - ETA: 23s - loss: 0.2459 - regression_loss: 0.2336 - classification_loss: 0.0123 455/500 [==========================>...] - ETA: 22s - loss: 0.2458 - regression_loss: 0.2335 - classification_loss: 0.0123 456/500 [==========================>...] - ETA: 22s - loss: 0.2457 - regression_loss: 0.2334 - classification_loss: 0.0123 457/500 [==========================>...] 
- ETA: 21s - loss: 0.2463 - regression_loss: 0.2340 - classification_loss: 0.0123 458/500 [==========================>...] - ETA: 21s - loss: 0.2462 - regression_loss: 0.2339 - classification_loss: 0.0123 459/500 [==========================>...] - ETA: 20s - loss: 0.2459 - regression_loss: 0.2337 - classification_loss: 0.0123 460/500 [==========================>...] - ETA: 20s - loss: 0.2458 - regression_loss: 0.2335 - classification_loss: 0.0122 461/500 [==========================>...] - ETA: 19s - loss: 0.2458 - regression_loss: 0.2335 - classification_loss: 0.0122 462/500 [==========================>...] - ETA: 19s - loss: 0.2460 - regression_loss: 0.2337 - classification_loss: 0.0123 463/500 [==========================>...] - ETA: 18s - loss: 0.2462 - regression_loss: 0.2340 - classification_loss: 0.0123 464/500 [==========================>...] - ETA: 18s - loss: 0.2463 - regression_loss: 0.2340 - classification_loss: 0.0123 465/500 [==========================>...] - ETA: 17s - loss: 0.2461 - regression_loss: 0.2338 - classification_loss: 0.0123 466/500 [==========================>...] - ETA: 17s - loss: 0.2464 - regression_loss: 0.2341 - classification_loss: 0.0123 467/500 [===========================>..] - ETA: 16s - loss: 0.2467 - regression_loss: 0.2345 - classification_loss: 0.0123 468/500 [===========================>..] - ETA: 16s - loss: 0.2468 - regression_loss: 0.2345 - classification_loss: 0.0123 469/500 [===========================>..] - ETA: 15s - loss: 0.2469 - regression_loss: 0.2346 - classification_loss: 0.0123 470/500 [===========================>..] - ETA: 15s - loss: 0.2466 - regression_loss: 0.2343 - classification_loss: 0.0122 471/500 [===========================>..] - ETA: 14s - loss: 0.2466 - regression_loss: 0.2343 - classification_loss: 0.0123 472/500 [===========================>..] - ETA: 14s - loss: 0.2475 - regression_loss: 0.2352 - classification_loss: 0.0123 473/500 [===========================>..] 
- ETA: 13s - loss: 0.2478 - regression_loss: 0.2355 - classification_loss: 0.0123 474/500 [===========================>..] - ETA: 13s - loss: 0.2476 - regression_loss: 0.2353 - classification_loss: 0.0123 475/500 [===========================>..] - ETA: 12s - loss: 0.2475 - regression_loss: 0.2353 - classification_loss: 0.0122 476/500 [===========================>..] - ETA: 12s - loss: 0.2477 - regression_loss: 0.2354 - classification_loss: 0.0123 477/500 [===========================>..] - ETA: 11s - loss: 0.2476 - regression_loss: 0.2353 - classification_loss: 0.0122 478/500 [===========================>..] - ETA: 11s - loss: 0.2479 - regression_loss: 0.2357 - classification_loss: 0.0123 479/500 [===========================>..] - ETA: 10s - loss: 0.2478 - regression_loss: 0.2356 - classification_loss: 0.0123 480/500 [===========================>..] - ETA: 10s - loss: 0.2478 - regression_loss: 0.2356 - classification_loss: 0.0122 481/500 [===========================>..] - ETA: 9s - loss: 0.2476 - regression_loss: 0.2353 - classification_loss: 0.0122  482/500 [===========================>..] - ETA: 9s - loss: 0.2476 - regression_loss: 0.2353 - classification_loss: 0.0122 483/500 [===========================>..] - ETA: 8s - loss: 0.2479 - regression_loss: 0.2356 - classification_loss: 0.0122 484/500 [============================>.] - ETA: 8s - loss: 0.2477 - regression_loss: 0.2355 - classification_loss: 0.0122 485/500 [============================>.] - ETA: 7s - loss: 0.2479 - regression_loss: 0.2357 - classification_loss: 0.0122 486/500 [============================>.] - ETA: 7s - loss: 0.2481 - regression_loss: 0.2358 - classification_loss: 0.0122 487/500 [============================>.] - ETA: 6s - loss: 0.2478 - regression_loss: 0.2356 - classification_loss: 0.0122 488/500 [============================>.] - ETA: 6s - loss: 0.2481 - regression_loss: 0.2358 - classification_loss: 0.0122 489/500 [============================>.] 
- ETA: 5s - loss: 0.2478 - regression_loss: 0.2356 - classification_loss: 0.0122 490/500 [============================>.] - ETA: 5s - loss: 0.2478 - regression_loss: 0.2356 - classification_loss: 0.0122 491/500 [============================>.] - ETA: 4s - loss: 0.2477 - regression_loss: 0.2354 - classification_loss: 0.0122 492/500 [============================>.] - ETA: 4s - loss: 0.2481 - regression_loss: 0.2358 - classification_loss: 0.0122 493/500 [============================>.] - ETA: 3s - loss: 0.2484 - regression_loss: 0.2362 - classification_loss: 0.0122 494/500 [============================>.] - ETA: 3s - loss: 0.2486 - regression_loss: 0.2364 - classification_loss: 0.0122 495/500 [============================>.] - ETA: 2s - loss: 0.2488 - regression_loss: 0.2365 - classification_loss: 0.0122 496/500 [============================>.] - ETA: 2s - loss: 0.2487 - regression_loss: 0.2364 - classification_loss: 0.0123 497/500 [============================>.] - ETA: 1s - loss: 0.2485 - regression_loss: 0.2363 - classification_loss: 0.0122 498/500 [============================>.] - ETA: 1s - loss: 0.2484 - regression_loss: 0.2362 - classification_loss: 0.0122 499/500 [============================>.] - ETA: 0s - loss: 0.2482 - regression_loss: 0.2360 - classification_loss: 0.0122 500/500 [==============================] - 251s 502ms/step - loss: 0.2481 - regression_loss: 0.2359 - classification_loss: 0.0122 1172 instances of class plum with average precision: 0.7405 mAP: 0.7405 Epoch 00029: saving model to ./training/snapshots/resnet101_pascal_29.h5 Epoch 30/150 1/500 [..............................] - ETA: 3:57 - loss: 0.6200 - regression_loss: 0.5998 - classification_loss: 0.0202 2/500 [..............................] - ETA: 4:03 - loss: 0.3786 - regression_loss: 0.3681 - classification_loss: 0.0105 3/500 [..............................] - ETA: 4:07 - loss: 0.4854 - regression_loss: 0.4664 - classification_loss: 0.0191 4/500 [..............................] 
- ETA: 4:07 - loss: 0.5044 - regression_loss: 0.4716 - classification_loss: 0.0328 5/500 [..............................] - ETA: 4:09 - loss: 0.4634 - regression_loss: 0.4345 - classification_loss: 0.0289 6/500 [..............................] - ETA: 4:09 - loss: 0.4076 - regression_loss: 0.3831 - classification_loss: 0.0245 7/500 [..............................] - ETA: 4:07 - loss: 0.3977 - regression_loss: 0.3730 - classification_loss: 0.0247 8/500 [..............................] - ETA: 4:06 - loss: 0.3646 - regression_loss: 0.3427 - classification_loss: 0.0218 9/500 [..............................] - ETA: 4:06 - loss: 0.3641 - regression_loss: 0.3443 - classification_loss: 0.0199 10/500 [..............................] - ETA: 4:06 - loss: 0.3383 - regression_loss: 0.3201 - classification_loss: 0.0183 11/500 [..............................] - ETA: 4:05 - loss: 0.3404 - regression_loss: 0.3223 - classification_loss: 0.0181 12/500 [..............................] - ETA: 4:04 - loss: 0.3471 - regression_loss: 0.3297 - classification_loss: 0.0175 13/500 [..............................] - ETA: 4:02 - loss: 0.3254 - regression_loss: 0.3091 - classification_loss: 0.0164 14/500 [..............................] - ETA: 4:02 - loss: 0.3179 - regression_loss: 0.3023 - classification_loss: 0.0156 15/500 [..............................] - ETA: 4:01 - loss: 0.3056 - regression_loss: 0.2907 - classification_loss: 0.0148 16/500 [..............................] - ETA: 4:00 - loss: 0.2890 - regression_loss: 0.2748 - classification_loss: 0.0142 17/500 [>.............................] - ETA: 3:59 - loss: 0.2921 - regression_loss: 0.2776 - classification_loss: 0.0144 18/500 [>.............................] - ETA: 3:58 - loss: 0.2842 - regression_loss: 0.2696 - classification_loss: 0.0146 19/500 [>.............................] - ETA: 3:58 - loss: 0.2808 - regression_loss: 0.2665 - classification_loss: 0.0143 20/500 [>.............................] 
- ETA: 3:58 - loss: 0.2729 - regression_loss: 0.2591 - classification_loss: 0.0139 21/500 [>.............................] - ETA: 3:58 - loss: 0.2660 - regression_loss: 0.2527 - classification_loss: 0.0134 22/500 [>.............................] - ETA: 3:57 - loss: 0.2597 - regression_loss: 0.2469 - classification_loss: 0.0129 23/500 [>.............................] - ETA: 3:57 - loss: 0.2619 - regression_loss: 0.2493 - classification_loss: 0.0126 24/500 [>.............................] - ETA: 3:57 - loss: 0.2622 - regression_loss: 0.2498 - classification_loss: 0.0125 25/500 [>.............................] - ETA: 3:56 - loss: 0.2662 - regression_loss: 0.2537 - classification_loss: 0.0125 26/500 [>.............................] - ETA: 3:56 - loss: 0.2681 - regression_loss: 0.2550 - classification_loss: 0.0130 27/500 [>.............................] - ETA: 3:56 - loss: 0.2619 - regression_loss: 0.2491 - classification_loss: 0.0129 28/500 [>.............................] - ETA: 3:55 - loss: 0.2563 - regression_loss: 0.2439 - classification_loss: 0.0124 29/500 [>.............................] - ETA: 3:54 - loss: 0.2557 - regression_loss: 0.2432 - classification_loss: 0.0126 30/500 [>.............................] - ETA: 3:54 - loss: 0.2578 - regression_loss: 0.2454 - classification_loss: 0.0124 31/500 [>.............................] - ETA: 3:53 - loss: 0.2588 - regression_loss: 0.2460 - classification_loss: 0.0129 32/500 [>.............................] - ETA: 3:53 - loss: 0.2580 - regression_loss: 0.2453 - classification_loss: 0.0128 33/500 [>.............................] - ETA: 3:53 - loss: 0.2536 - regression_loss: 0.2412 - classification_loss: 0.0125 34/500 [=>............................] - ETA: 3:52 - loss: 0.2517 - regression_loss: 0.2393 - classification_loss: 0.0124 35/500 [=>............................] - ETA: 3:52 - loss: 0.2509 - regression_loss: 0.2386 - classification_loss: 0.0123 36/500 [=>............................] 
- ETA: 3:51 - loss: 0.2549 - regression_loss: 0.2426 - classification_loss: 0.0123 37/500 [=>............................] - ETA: 3:51 - loss: 0.2507 - regression_loss: 0.2387 - classification_loss: 0.0121 38/500 [=>............................] - ETA: 3:50 - loss: 0.2487 - regression_loss: 0.2368 - classification_loss: 0.0119 39/500 [=>............................] - ETA: 3:50 - loss: 0.2465 - regression_loss: 0.2347 - classification_loss: 0.0118 40/500 [=>............................] - ETA: 3:49 - loss: 0.2443 - regression_loss: 0.2326 - classification_loss: 0.0117 41/500 [=>............................] - ETA: 3:49 - loss: 0.2431 - regression_loss: 0.2315 - classification_loss: 0.0116 42/500 [=>............................] - ETA: 3:49 - loss: 0.2432 - regression_loss: 0.2316 - classification_loss: 0.0116 43/500 [=>............................] - ETA: 3:48 - loss: 0.2414 - regression_loss: 0.2300 - classification_loss: 0.0114 44/500 [=>............................] - ETA: 3:48 - loss: 0.2404 - regression_loss: 0.2290 - classification_loss: 0.0113 45/500 [=>............................] - ETA: 3:47 - loss: 0.2367 - regression_loss: 0.2255 - classification_loss: 0.0111 46/500 [=>............................] - ETA: 3:47 - loss: 0.2368 - regression_loss: 0.2257 - classification_loss: 0.0111 47/500 [=>............................] - ETA: 3:46 - loss: 0.2378 - regression_loss: 0.2263 - classification_loss: 0.0115 48/500 [=>............................] - ETA: 3:46 - loss: 0.2373 - regression_loss: 0.2256 - classification_loss: 0.0117 49/500 [=>............................] - ETA: 3:45 - loss: 0.2370 - regression_loss: 0.2253 - classification_loss: 0.0116 50/500 [==>...........................] - ETA: 3:45 - loss: 0.2345 - regression_loss: 0.2230 - classification_loss: 0.0115 51/500 [==>...........................] - ETA: 3:44 - loss: 0.2412 - regression_loss: 0.2295 - classification_loss: 0.0117 52/500 [==>...........................] 
- ETA: 3:44 - loss: 0.2428 - regression_loss: 0.2312 - classification_loss: 0.0116 53/500 [==>...........................] - ETA: 3:44 - loss: 0.2426 - regression_loss: 0.2311 - classification_loss: 0.0115 54/500 [==>...........................] - ETA: 3:43 - loss: 0.2444 - regression_loss: 0.2328 - classification_loss: 0.0117 55/500 [==>...........................] - ETA: 3:43 - loss: 0.2430 - regression_loss: 0.2315 - classification_loss: 0.0115 56/500 [==>...........................] - ETA: 3:43 - loss: 0.2422 - regression_loss: 0.2307 - classification_loss: 0.0115 57/500 [==>...........................] - ETA: 3:42 - loss: 0.2404 - regression_loss: 0.2290 - classification_loss: 0.0114 58/500 [==>...........................] - ETA: 3:42 - loss: 0.2405 - regression_loss: 0.2291 - classification_loss: 0.0113 59/500 [==>...........................] - ETA: 3:41 - loss: 0.2382 - regression_loss: 0.2268 - classification_loss: 0.0114 60/500 [==>...........................] - ETA: 3:41 - loss: 0.2359 - regression_loss: 0.2247 - classification_loss: 0.0112 61/500 [==>...........................] - ETA: 3:40 - loss: 0.2366 - regression_loss: 0.2253 - classification_loss: 0.0113 62/500 [==>...........................] - ETA: 3:40 - loss: 0.2372 - regression_loss: 0.2257 - classification_loss: 0.0114 63/500 [==>...........................] - ETA: 3:39 - loss: 0.2365 - regression_loss: 0.2252 - classification_loss: 0.0113 64/500 [==>...........................] - ETA: 3:39 - loss: 0.2359 - regression_loss: 0.2247 - classification_loss: 0.0112 65/500 [==>...........................] - ETA: 3:38 - loss: 0.2359 - regression_loss: 0.2247 - classification_loss: 0.0112 66/500 [==>...........................] - ETA: 3:38 - loss: 0.2375 - regression_loss: 0.2263 - classification_loss: 0.0112 67/500 [===>..........................] - ETA: 3:37 - loss: 0.2384 - regression_loss: 0.2272 - classification_loss: 0.0112 68/500 [===>..........................] 
- ETA: 3:37 - loss: 0.2371 - regression_loss: 0.2260 - classification_loss: 0.0111 69/500 [===>..........................] - ETA: 3:36 - loss: 0.2349 - regression_loss: 0.2239 - classification_loss: 0.0110 70/500 [===>..........................] - ETA: 3:36 - loss: 0.2370 - regression_loss: 0.2258 - classification_loss: 0.0111 71/500 [===>..........................] - ETA: 3:35 - loss: 0.2356 - regression_loss: 0.2246 - classification_loss: 0.0110 72/500 [===>..........................] - ETA: 3:35 - loss: 0.2377 - regression_loss: 0.2267 - classification_loss: 0.0111 73/500 [===>..........................] - ETA: 3:34 - loss: 0.2373 - regression_loss: 0.2264 - classification_loss: 0.0110 74/500 [===>..........................] - ETA: 3:33 - loss: 0.2379 - regression_loss: 0.2270 - classification_loss: 0.0109 75/500 [===>..........................] - ETA: 3:33 - loss: 0.2372 - regression_loss: 0.2264 - classification_loss: 0.0108 76/500 [===>..........................] - ETA: 3:33 - loss: 0.2374 - regression_loss: 0.2267 - classification_loss: 0.0107 77/500 [===>..........................] - ETA: 3:32 - loss: 0.2386 - regression_loss: 0.2278 - classification_loss: 0.0107 78/500 [===>..........................] - ETA: 3:32 - loss: 0.2400 - regression_loss: 0.2291 - classification_loss: 0.0109 79/500 [===>..........................] - ETA: 3:31 - loss: 0.2436 - regression_loss: 0.2327 - classification_loss: 0.0109 80/500 [===>..........................] - ETA: 3:31 - loss: 0.2428 - regression_loss: 0.2319 - classification_loss: 0.0109 81/500 [===>..........................] - ETA: 3:30 - loss: 0.2423 - regression_loss: 0.2314 - classification_loss: 0.0109 82/500 [===>..........................] - ETA: 3:30 - loss: 0.2420 - regression_loss: 0.2312 - classification_loss: 0.0108 83/500 [===>..........................] - ETA: 3:29 - loss: 0.2405 - regression_loss: 0.2298 - classification_loss: 0.0107 84/500 [====>.........................] 
- ETA: 3:29 - loss: 0.2423 - regression_loss: 0.2315 - classification_loss: 0.0108 85/500 [====>.........................] - ETA: 3:28 - loss: 0.2418 - regression_loss: 0.2311 - classification_loss: 0.0107 86/500 [====>.........................] - ETA: 3:28 - loss: 0.2426 - regression_loss: 0.2320 - classification_loss: 0.0106 87/500 [====>.........................] - ETA: 3:27 - loss: 0.2422 - regression_loss: 0.2316 - classification_loss: 0.0106 88/500 [====>.........................] - ETA: 3:27 - loss: 0.2424 - regression_loss: 0.2318 - classification_loss: 0.0106 89/500 [====>.........................] - ETA: 3:26 - loss: 0.2429 - regression_loss: 0.2324 - classification_loss: 0.0105 90/500 [====>.........................] - ETA: 3:25 - loss: 0.2432 - regression_loss: 0.2327 - classification_loss: 0.0105 91/500 [====>.........................] - ETA: 3:25 - loss: 0.2425 - regression_loss: 0.2319 - classification_loss: 0.0106 92/500 [====>.........................] - ETA: 3:25 - loss: 0.2435 - regression_loss: 0.2330 - classification_loss: 0.0105 93/500 [====>.........................] - ETA: 3:24 - loss: 0.2441 - regression_loss: 0.2335 - classification_loss: 0.0106 94/500 [====>.........................] - ETA: 3:24 - loss: 0.2440 - regression_loss: 0.2333 - classification_loss: 0.0106 95/500 [====>.........................] - ETA: 3:23 - loss: 0.2453 - regression_loss: 0.2346 - classification_loss: 0.0107 96/500 [====>.........................] - ETA: 3:23 - loss: 0.2446 - regression_loss: 0.2340 - classification_loss: 0.0106 97/500 [====>.........................] - ETA: 3:22 - loss: 0.2435 - regression_loss: 0.2330 - classification_loss: 0.0105 98/500 [====>.........................] - ETA: 3:21 - loss: 0.2427 - regression_loss: 0.2323 - classification_loss: 0.0105 99/500 [====>.........................] - ETA: 3:21 - loss: 0.2435 - regression_loss: 0.2329 - classification_loss: 0.0105 100/500 [=====>........................] 
- ETA: 3:21 - loss: 0.2441 - regression_loss: 0.2336 - classification_loss: 0.0105
[... per-batch progress updates for batches 101/500 through 435/500 condensed: loss decreases gradually from ~0.244 to ~0.239, regression_loss from ~0.233 to ~0.228, classification_loss steady at ~0.0105 ...]
436/500 [=========================>....] 
- ETA: 32s - loss: 0.2388 - regression_loss: 0.2282 - classification_loss: 0.0106 437/500 [=========================>....] - ETA: 31s - loss: 0.2386 - regression_loss: 0.2280 - classification_loss: 0.0106 438/500 [=========================>....] - ETA: 31s - loss: 0.2388 - regression_loss: 0.2282 - classification_loss: 0.0106 439/500 [=========================>....] - ETA: 30s - loss: 0.2385 - regression_loss: 0.2279 - classification_loss: 0.0106 440/500 [=========================>....] - ETA: 30s - loss: 0.2387 - regression_loss: 0.2281 - classification_loss: 0.0106 441/500 [=========================>....] - ETA: 29s - loss: 0.2384 - regression_loss: 0.2278 - classification_loss: 0.0106 442/500 [=========================>....] - ETA: 29s - loss: 0.2381 - regression_loss: 0.2276 - classification_loss: 0.0106 443/500 [=========================>....] - ETA: 28s - loss: 0.2380 - regression_loss: 0.2275 - classification_loss: 0.0106 444/500 [=========================>....] - ETA: 28s - loss: 0.2382 - regression_loss: 0.2275 - classification_loss: 0.0106 445/500 [=========================>....] - ETA: 27s - loss: 0.2379 - regression_loss: 0.2273 - classification_loss: 0.0106 446/500 [=========================>....] - ETA: 27s - loss: 0.2379 - regression_loss: 0.2273 - classification_loss: 0.0106 447/500 [=========================>....] - ETA: 26s - loss: 0.2380 - regression_loss: 0.2274 - classification_loss: 0.0106 448/500 [=========================>....] - ETA: 26s - loss: 0.2377 - regression_loss: 0.2271 - classification_loss: 0.0106 449/500 [=========================>....] - ETA: 25s - loss: 0.2378 - regression_loss: 0.2272 - classification_loss: 0.0106 450/500 [==========================>...] - ETA: 25s - loss: 0.2378 - regression_loss: 0.2272 - classification_loss: 0.0106 451/500 [==========================>...] - ETA: 24s - loss: 0.2375 - regression_loss: 0.2270 - classification_loss: 0.0106 452/500 [==========================>...] 
- ETA: 24s - loss: 0.2374 - regression_loss: 0.2269 - classification_loss: 0.0105 453/500 [==========================>...] - ETA: 23s - loss: 0.2372 - regression_loss: 0.2266 - classification_loss: 0.0105 454/500 [==========================>...] - ETA: 23s - loss: 0.2369 - regression_loss: 0.2264 - classification_loss: 0.0105 455/500 [==========================>...] - ETA: 22s - loss: 0.2368 - regression_loss: 0.2263 - classification_loss: 0.0105 456/500 [==========================>...] - ETA: 22s - loss: 0.2366 - regression_loss: 0.2261 - classification_loss: 0.0105 457/500 [==========================>...] - ETA: 21s - loss: 0.2364 - regression_loss: 0.2259 - classification_loss: 0.0105 458/500 [==========================>...] - ETA: 21s - loss: 0.2362 - regression_loss: 0.2258 - classification_loss: 0.0105 459/500 [==========================>...] - ETA: 20s - loss: 0.2365 - regression_loss: 0.2260 - classification_loss: 0.0105 460/500 [==========================>...] - ETA: 20s - loss: 0.2362 - regression_loss: 0.2258 - classification_loss: 0.0105 461/500 [==========================>...] - ETA: 19s - loss: 0.2363 - regression_loss: 0.2259 - classification_loss: 0.0105 462/500 [==========================>...] - ETA: 19s - loss: 0.2365 - regression_loss: 0.2260 - classification_loss: 0.0105 463/500 [==========================>...] - ETA: 18s - loss: 0.2364 - regression_loss: 0.2259 - classification_loss: 0.0105 464/500 [==========================>...] - ETA: 18s - loss: 0.2362 - regression_loss: 0.2257 - classification_loss: 0.0105 465/500 [==========================>...] - ETA: 17s - loss: 0.2360 - regression_loss: 0.2255 - classification_loss: 0.0105 466/500 [==========================>...] - ETA: 17s - loss: 0.2361 - regression_loss: 0.2256 - classification_loss: 0.0105 467/500 [===========================>..] - ETA: 16s - loss: 0.2360 - regression_loss: 0.2256 - classification_loss: 0.0105 468/500 [===========================>..] 
- ETA: 16s - loss: 0.2360 - regression_loss: 0.2255 - classification_loss: 0.0105 469/500 [===========================>..] - ETA: 15s - loss: 0.2359 - regression_loss: 0.2255 - classification_loss: 0.0105 470/500 [===========================>..] - ETA: 15s - loss: 0.2356 - regression_loss: 0.2252 - classification_loss: 0.0105 471/500 [===========================>..] - ETA: 14s - loss: 0.2357 - regression_loss: 0.2252 - classification_loss: 0.0105 472/500 [===========================>..] - ETA: 14s - loss: 0.2353 - regression_loss: 0.2249 - classification_loss: 0.0104 473/500 [===========================>..] - ETA: 13s - loss: 0.2351 - regression_loss: 0.2246 - classification_loss: 0.0104 474/500 [===========================>..] - ETA: 13s - loss: 0.2351 - regression_loss: 0.2247 - classification_loss: 0.0104 475/500 [===========================>..] - ETA: 12s - loss: 0.2355 - regression_loss: 0.2251 - classification_loss: 0.0104 476/500 [===========================>..] - ETA: 12s - loss: 0.2358 - regression_loss: 0.2253 - classification_loss: 0.0104 477/500 [===========================>..] - ETA: 11s - loss: 0.2357 - regression_loss: 0.2253 - classification_loss: 0.0104 478/500 [===========================>..] - ETA: 11s - loss: 0.2357 - regression_loss: 0.2253 - classification_loss: 0.0104 479/500 [===========================>..] - ETA: 10s - loss: 0.2355 - regression_loss: 0.2251 - classification_loss: 0.0104 480/500 [===========================>..] - ETA: 10s - loss: 0.2354 - regression_loss: 0.2249 - classification_loss: 0.0104 481/500 [===========================>..] - ETA: 9s - loss: 0.2355 - regression_loss: 0.2251 - classification_loss: 0.0104  482/500 [===========================>..] - ETA: 9s - loss: 0.2356 - regression_loss: 0.2252 - classification_loss: 0.0104 483/500 [===========================>..] - ETA: 8s - loss: 0.2356 - regression_loss: 0.2253 - classification_loss: 0.0104 484/500 [============================>.] 
- ETA: 8s - loss: 0.2354 - regression_loss: 0.2250 - classification_loss: 0.0104 485/500 [============================>.] - ETA: 7s - loss: 0.2354 - regression_loss: 0.2250 - classification_loss: 0.0104 486/500 [============================>.] - ETA: 7s - loss: 0.2353 - regression_loss: 0.2250 - classification_loss: 0.0104 487/500 [============================>.] - ETA: 6s - loss: 0.2352 - regression_loss: 0.2249 - classification_loss: 0.0104 488/500 [============================>.] - ETA: 6s - loss: 0.2356 - regression_loss: 0.2252 - classification_loss: 0.0104 489/500 [============================>.] - ETA: 5s - loss: 0.2357 - regression_loss: 0.2253 - classification_loss: 0.0104 490/500 [============================>.] - ETA: 5s - loss: 0.2356 - regression_loss: 0.2252 - classification_loss: 0.0104 491/500 [============================>.] - ETA: 4s - loss: 0.2356 - regression_loss: 0.2252 - classification_loss: 0.0104 492/500 [============================>.] - ETA: 4s - loss: 0.2364 - regression_loss: 0.2259 - classification_loss: 0.0104 493/500 [============================>.] - ETA: 3s - loss: 0.2363 - regression_loss: 0.2259 - classification_loss: 0.0104 494/500 [============================>.] - ETA: 3s - loss: 0.2362 - regression_loss: 0.2258 - classification_loss: 0.0104 495/500 [============================>.] - ETA: 2s - loss: 0.2361 - regression_loss: 0.2257 - classification_loss: 0.0104 496/500 [============================>.] - ETA: 2s - loss: 0.2364 - regression_loss: 0.2260 - classification_loss: 0.0104 497/500 [============================>.] - ETA: 1s - loss: 0.2362 - regression_loss: 0.2258 - classification_loss: 0.0104 498/500 [============================>.] - ETA: 1s - loss: 0.2363 - regression_loss: 0.2258 - classification_loss: 0.0104 499/500 [============================>.] 
- ETA: 0s - loss: 0.2361 - regression_loss: 0.2257 - classification_loss: 0.0104 500/500 [==============================] - 251s 503ms/step - loss: 0.2358 - regression_loss: 0.2254 - classification_loss: 0.0104 1172 instances of class plum with average precision: 0.7599 mAP: 0.7599 Epoch 00030: saving model to ./training/snapshots/resnet101_pascal_30.h5 Epoch 31/150 1/500 [..............................] - ETA: 4:18 - loss: 0.1773 - regression_loss: 0.1726 - classification_loss: 0.0047 2/500 [..............................] - ETA: 4:15 - loss: 0.2005 - regression_loss: 0.1933 - classification_loss: 0.0072 3/500 [..............................] - ETA: 4:09 - loss: 0.3258 - regression_loss: 0.3172 - classification_loss: 0.0087 4/500 [..............................] - ETA: 4:09 - loss: 0.2978 - regression_loss: 0.2886 - classification_loss: 0.0093 5/500 [..............................] - ETA: 4:11 - loss: 0.2818 - regression_loss: 0.2723 - classification_loss: 0.0095 6/500 [..............................] - ETA: 4:10 - loss: 0.2585 - regression_loss: 0.2503 - classification_loss: 0.0082 7/500 [..............................] - ETA: 4:08 - loss: 0.2474 - regression_loss: 0.2395 - classification_loss: 0.0079 8/500 [..............................] - ETA: 4:08 - loss: 0.2639 - regression_loss: 0.2548 - classification_loss: 0.0091 9/500 [..............................] - ETA: 4:07 - loss: 0.2579 - regression_loss: 0.2491 - classification_loss: 0.0088 10/500 [..............................] - ETA: 4:06 - loss: 0.2455 - regression_loss: 0.2369 - classification_loss: 0.0086 11/500 [..............................] - ETA: 4:05 - loss: 0.2450 - regression_loss: 0.2356 - classification_loss: 0.0093 12/500 [..............................] - ETA: 4:05 - loss: 0.2426 - regression_loss: 0.2332 - classification_loss: 0.0094 13/500 [..............................] - ETA: 4:05 - loss: 0.2341 - regression_loss: 0.2252 - classification_loss: 0.0089 14/500 [..............................] 
- ETA: 2:27 - loss: 0.2307 - regression_loss: 0.2207 - classification_loss: 0.0100 207/500 [===========>..................] - ETA: 2:26 - loss: 0.2305 - regression_loss: 0.2205 - classification_loss: 0.0100 208/500 [===========>..................] - ETA: 2:26 - loss: 0.2299 - regression_loss: 0.2199 - classification_loss: 0.0099 209/500 [===========>..................] - ETA: 2:25 - loss: 0.2304 - regression_loss: 0.2205 - classification_loss: 0.0100 210/500 [===========>..................] - ETA: 2:25 - loss: 0.2306 - regression_loss: 0.2206 - classification_loss: 0.0100 211/500 [===========>..................] - ETA: 2:24 - loss: 0.2297 - regression_loss: 0.2198 - classification_loss: 0.0099 212/500 [===========>..................] - ETA: 2:24 - loss: 0.2298 - regression_loss: 0.2199 - classification_loss: 0.0100 213/500 [===========>..................] - ETA: 2:23 - loss: 0.2291 - regression_loss: 0.2191 - classification_loss: 0.0100 214/500 [===========>..................] - ETA: 2:23 - loss: 0.2288 - regression_loss: 0.2188 - classification_loss: 0.0100 215/500 [===========>..................] - ETA: 2:22 - loss: 0.2287 - regression_loss: 0.2188 - classification_loss: 0.0099 216/500 [===========>..................] - ETA: 2:22 - loss: 0.2288 - regression_loss: 0.2188 - classification_loss: 0.0099 217/500 [============>.................] - ETA: 2:21 - loss: 0.2284 - regression_loss: 0.2185 - classification_loss: 0.0099 218/500 [============>.................] - ETA: 2:21 - loss: 0.2280 - regression_loss: 0.2181 - classification_loss: 0.0099 219/500 [============>.................] - ETA: 2:20 - loss: 0.2279 - regression_loss: 0.2181 - classification_loss: 0.0099 220/500 [============>.................] - ETA: 2:20 - loss: 0.2279 - regression_loss: 0.2180 - classification_loss: 0.0099 221/500 [============>.................] - ETA: 2:19 - loss: 0.2276 - regression_loss: 0.2178 - classification_loss: 0.0098 222/500 [============>.................] 
- ETA: 2:19 - loss: 0.2274 - regression_loss: 0.2176 - classification_loss: 0.0098 223/500 [============>.................] - ETA: 2:18 - loss: 0.2268 - regression_loss: 0.2171 - classification_loss: 0.0098 224/500 [============>.................] - ETA: 2:18 - loss: 0.2272 - regression_loss: 0.2174 - classification_loss: 0.0098 225/500 [============>.................] - ETA: 2:17 - loss: 0.2270 - regression_loss: 0.2172 - classification_loss: 0.0098 226/500 [============>.................] - ETA: 2:17 - loss: 0.2275 - regression_loss: 0.2177 - classification_loss: 0.0098 227/500 [============>.................] - ETA: 2:16 - loss: 0.2281 - regression_loss: 0.2183 - classification_loss: 0.0098 228/500 [============>.................] - ETA: 2:16 - loss: 0.2288 - regression_loss: 0.2190 - classification_loss: 0.0098 229/500 [============>.................] - ETA: 2:15 - loss: 0.2284 - regression_loss: 0.2186 - classification_loss: 0.0097 230/500 [============>.................] - ETA: 2:15 - loss: 0.2288 - regression_loss: 0.2191 - classification_loss: 0.0097 231/500 [============>.................] - ETA: 2:14 - loss: 0.2289 - regression_loss: 0.2191 - classification_loss: 0.0098 232/500 [============>.................] - ETA: 2:14 - loss: 0.2286 - regression_loss: 0.2188 - classification_loss: 0.0097 233/500 [============>.................] - ETA: 2:13 - loss: 0.2288 - regression_loss: 0.2190 - classification_loss: 0.0098 234/500 [=============>................] - ETA: 2:13 - loss: 0.2290 - regression_loss: 0.2193 - classification_loss: 0.0098 235/500 [=============>................] - ETA: 2:12 - loss: 0.2293 - regression_loss: 0.2196 - classification_loss: 0.0098 236/500 [=============>................] - ETA: 2:12 - loss: 0.2291 - regression_loss: 0.2194 - classification_loss: 0.0098 237/500 [=============>................] - ETA: 2:11 - loss: 0.2296 - regression_loss: 0.2198 - classification_loss: 0.0098 238/500 [=============>................] 
- ETA: 2:11 - loss: 0.2295 - regression_loss: 0.2197 - classification_loss: 0.0098 239/500 [=============>................] - ETA: 2:10 - loss: 0.2292 - regression_loss: 0.2194 - classification_loss: 0.0098 240/500 [=============>................] - ETA: 2:10 - loss: 0.2289 - regression_loss: 0.2191 - classification_loss: 0.0098 241/500 [=============>................] - ETA: 2:09 - loss: 0.2298 - regression_loss: 0.2199 - classification_loss: 0.0099 242/500 [=============>................] - ETA: 2:09 - loss: 0.2295 - regression_loss: 0.2197 - classification_loss: 0.0099 243/500 [=============>................] - ETA: 2:08 - loss: 0.2300 - regression_loss: 0.2201 - classification_loss: 0.0099 244/500 [=============>................] - ETA: 2:08 - loss: 0.2305 - regression_loss: 0.2206 - classification_loss: 0.0099 245/500 [=============>................] - ETA: 2:07 - loss: 0.2300 - regression_loss: 0.2202 - classification_loss: 0.0098 246/500 [=============>................] - ETA: 2:07 - loss: 0.2297 - regression_loss: 0.2199 - classification_loss: 0.0098 247/500 [=============>................] - ETA: 2:06 - loss: 0.2307 - regression_loss: 0.2208 - classification_loss: 0.0098 248/500 [=============>................] - ETA: 2:06 - loss: 0.2304 - regression_loss: 0.2205 - classification_loss: 0.0098 249/500 [=============>................] - ETA: 2:05 - loss: 0.2298 - regression_loss: 0.2200 - classification_loss: 0.0098 250/500 [==============>...............] - ETA: 2:05 - loss: 0.2302 - regression_loss: 0.2203 - classification_loss: 0.0098 251/500 [==============>...............] - ETA: 2:04 - loss: 0.2301 - regression_loss: 0.2203 - classification_loss: 0.0098 252/500 [==============>...............] - ETA: 2:04 - loss: 0.2302 - regression_loss: 0.2204 - classification_loss: 0.0098 253/500 [==============>...............] - ETA: 2:03 - loss: 0.2299 - regression_loss: 0.2201 - classification_loss: 0.0098 254/500 [==============>...............] 
- ETA: 2:03 - loss: 0.2302 - regression_loss: 0.2204 - classification_loss: 0.0098 255/500 [==============>...............] - ETA: 2:02 - loss: 0.2301 - regression_loss: 0.2203 - classification_loss: 0.0098 256/500 [==============>...............] - ETA: 2:02 - loss: 0.2298 - regression_loss: 0.2200 - classification_loss: 0.0098 257/500 [==============>...............] - ETA: 2:01 - loss: 0.2292 - regression_loss: 0.2195 - classification_loss: 0.0097 258/500 [==============>...............] - ETA: 2:01 - loss: 0.2290 - regression_loss: 0.2193 - classification_loss: 0.0097 259/500 [==============>...............] - ETA: 2:00 - loss: 0.2294 - regression_loss: 0.2197 - classification_loss: 0.0097 260/500 [==============>...............] - ETA: 2:00 - loss: 0.2302 - regression_loss: 0.2203 - classification_loss: 0.0100 261/500 [==============>...............] - ETA: 1:59 - loss: 0.2316 - regression_loss: 0.2216 - classification_loss: 0.0100 262/500 [==============>...............] - ETA: 1:59 - loss: 0.2320 - regression_loss: 0.2220 - classification_loss: 0.0100 263/500 [==============>...............] - ETA: 1:58 - loss: 0.2321 - regression_loss: 0.2222 - classification_loss: 0.0099 264/500 [==============>...............] - ETA: 1:58 - loss: 0.2323 - regression_loss: 0.2223 - classification_loss: 0.0100 265/500 [==============>...............] - ETA: 1:57 - loss: 0.2320 - regression_loss: 0.2220 - classification_loss: 0.0100 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2321 - regression_loss: 0.2220 - classification_loss: 0.0100 267/500 [===============>..............] - ETA: 1:56 - loss: 0.2328 - regression_loss: 0.2228 - classification_loss: 0.0100 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2325 - regression_loss: 0.2225 - classification_loss: 0.0100 269/500 [===============>..............] - ETA: 1:55 - loss: 0.2327 - regression_loss: 0.2227 - classification_loss: 0.0100 270/500 [===============>..............] 
- ETA: 1:55 - loss: 0.2326 - regression_loss: 0.2227 - classification_loss: 0.0100 271/500 [===============>..............] - ETA: 1:54 - loss: 0.2323 - regression_loss: 0.2223 - classification_loss: 0.0100 272/500 [===============>..............] - ETA: 1:54 - loss: 0.2322 - regression_loss: 0.2223 - classification_loss: 0.0100 273/500 [===============>..............] - ETA: 1:53 - loss: 0.2324 - regression_loss: 0.2224 - classification_loss: 0.0100 274/500 [===============>..............] - ETA: 1:53 - loss: 0.2324 - regression_loss: 0.2224 - classification_loss: 0.0100 275/500 [===============>..............] - ETA: 1:52 - loss: 0.2321 - regression_loss: 0.2221 - classification_loss: 0.0100 276/500 [===============>..............] - ETA: 1:52 - loss: 0.2327 - regression_loss: 0.2227 - classification_loss: 0.0100 277/500 [===============>..............] - ETA: 1:51 - loss: 0.2327 - regression_loss: 0.2227 - classification_loss: 0.0100 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2328 - regression_loss: 0.2228 - classification_loss: 0.0100 279/500 [===============>..............] - ETA: 1:50 - loss: 0.2327 - regression_loss: 0.2228 - classification_loss: 0.0100 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2321 - regression_loss: 0.2222 - classification_loss: 0.0100 281/500 [===============>..............] - ETA: 1:49 - loss: 0.2323 - regression_loss: 0.2223 - classification_loss: 0.0100 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2322 - regression_loss: 0.2223 - classification_loss: 0.0099 283/500 [===============>..............] - ETA: 1:48 - loss: 0.2322 - regression_loss: 0.2222 - classification_loss: 0.0099 284/500 [================>.............] - ETA: 1:48 - loss: 0.2327 - regression_loss: 0.2227 - classification_loss: 0.0100 285/500 [================>.............] - ETA: 1:47 - loss: 0.2327 - regression_loss: 0.2227 - classification_loss: 0.0100 286/500 [================>.............] 
- ETA: 1:47 - loss: 0.2329 - regression_loss: 0.2229 - classification_loss: 0.0100 287/500 [================>.............] - ETA: 1:46 - loss: 0.2328 - regression_loss: 0.2228 - classification_loss: 0.0100 288/500 [================>.............] - ETA: 1:46 - loss: 0.2327 - regression_loss: 0.2227 - classification_loss: 0.0100 289/500 [================>.............] - ETA: 1:45 - loss: 0.2328 - regression_loss: 0.2229 - classification_loss: 0.0100 290/500 [================>.............] - ETA: 1:45 - loss: 0.2331 - regression_loss: 0.2231 - classification_loss: 0.0100 291/500 [================>.............] - ETA: 1:44 - loss: 0.2326 - regression_loss: 0.2227 - classification_loss: 0.0099 292/500 [================>.............] - ETA: 1:44 - loss: 0.2323 - regression_loss: 0.2224 - classification_loss: 0.0099 293/500 [================>.............] - ETA: 1:43 - loss: 0.2321 - regression_loss: 0.2222 - classification_loss: 0.0099 294/500 [================>.............] - ETA: 1:43 - loss: 0.2319 - regression_loss: 0.2220 - classification_loss: 0.0099 295/500 [================>.............] - ETA: 1:42 - loss: 0.2315 - regression_loss: 0.2217 - classification_loss: 0.0099 296/500 [================>.............] - ETA: 1:42 - loss: 0.2320 - regression_loss: 0.2220 - classification_loss: 0.0099 297/500 [================>.............] - ETA: 1:41 - loss: 0.2320 - regression_loss: 0.2220 - classification_loss: 0.0100 298/500 [================>.............] - ETA: 1:41 - loss: 0.2318 - regression_loss: 0.2218 - classification_loss: 0.0100 299/500 [================>.............] - ETA: 1:40 - loss: 0.2319 - regression_loss: 0.2219 - classification_loss: 0.0100 300/500 [=================>............] - ETA: 1:40 - loss: 0.2321 - regression_loss: 0.2221 - classification_loss: 0.0100 301/500 [=================>............] - ETA: 1:39 - loss: 0.2328 - regression_loss: 0.2227 - classification_loss: 0.0101 302/500 [=================>............] 
- ETA: 1:39 - loss: 0.2330 - regression_loss: 0.2229 - classification_loss: 0.0101 303/500 [=================>............] - ETA: 1:38 - loss: 0.2327 - regression_loss: 0.2226 - classification_loss: 0.0100 304/500 [=================>............] - ETA: 1:38 - loss: 0.2323 - regression_loss: 0.2223 - classification_loss: 0.0100 305/500 [=================>............] - ETA: 1:37 - loss: 0.2323 - regression_loss: 0.2222 - classification_loss: 0.0100 306/500 [=================>............] - ETA: 1:37 - loss: 0.2321 - regression_loss: 0.2221 - classification_loss: 0.0100 307/500 [=================>............] - ETA: 1:36 - loss: 0.2321 - regression_loss: 0.2221 - classification_loss: 0.0100 308/500 [=================>............] - ETA: 1:36 - loss: 0.2319 - regression_loss: 0.2218 - classification_loss: 0.0101 309/500 [=================>............] - ETA: 1:35 - loss: 0.2319 - regression_loss: 0.2219 - classification_loss: 0.0101 310/500 [=================>............] - ETA: 1:35 - loss: 0.2319 - regression_loss: 0.2218 - classification_loss: 0.0101 311/500 [=================>............] - ETA: 1:34 - loss: 0.2315 - regression_loss: 0.2215 - classification_loss: 0.0100 312/500 [=================>............] - ETA: 1:34 - loss: 0.2315 - regression_loss: 0.2215 - classification_loss: 0.0100 313/500 [=================>............] - ETA: 1:33 - loss: 0.2318 - regression_loss: 0.2218 - classification_loss: 0.0100 314/500 [=================>............] - ETA: 1:33 - loss: 0.2316 - regression_loss: 0.2216 - classification_loss: 0.0100 315/500 [=================>............] - ETA: 1:32 - loss: 0.2316 - regression_loss: 0.2215 - classification_loss: 0.0100 316/500 [=================>............] - ETA: 1:32 - loss: 0.2316 - regression_loss: 0.2215 - classification_loss: 0.0101 317/500 [==================>...........] - ETA: 1:31 - loss: 0.2315 - regression_loss: 0.2214 - classification_loss: 0.0101 318/500 [==================>...........] 
- ETA: 1:31 - loss: 0.2317 - regression_loss: 0.2216 - classification_loss: 0.0101 319/500 [==================>...........] - ETA: 1:30 - loss: 0.2318 - regression_loss: 0.2217 - classification_loss: 0.0101 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2317 - regression_loss: 0.2216 - classification_loss: 0.0101 321/500 [==================>...........] - ETA: 1:29 - loss: 0.2316 - regression_loss: 0.2214 - classification_loss: 0.0101 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2313 - regression_loss: 0.2212 - classification_loss: 0.0101 323/500 [==================>...........] - ETA: 1:28 - loss: 0.2315 - regression_loss: 0.2213 - classification_loss: 0.0102 324/500 [==================>...........] - ETA: 1:28 - loss: 0.2314 - regression_loss: 0.2212 - classification_loss: 0.0101 325/500 [==================>...........] - ETA: 1:27 - loss: 0.2316 - regression_loss: 0.2214 - classification_loss: 0.0102 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2313 - regression_loss: 0.2211 - classification_loss: 0.0102 327/500 [==================>...........] - ETA: 1:26 - loss: 0.2312 - regression_loss: 0.2210 - classification_loss: 0.0102 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2311 - regression_loss: 0.2209 - classification_loss: 0.0102 329/500 [==================>...........] - ETA: 1:25 - loss: 0.2309 - regression_loss: 0.2208 - classification_loss: 0.0102 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2308 - regression_loss: 0.2206 - classification_loss: 0.0102 331/500 [==================>...........] - ETA: 1:24 - loss: 0.2313 - regression_loss: 0.2211 - classification_loss: 0.0102 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2310 - regression_loss: 0.2208 - classification_loss: 0.0102 333/500 [==================>...........] - ETA: 1:23 - loss: 0.2304 - regression_loss: 0.2202 - classification_loss: 0.0102 334/500 [===================>..........] 
- ETA: 1:23 - loss: 0.2304 - regression_loss: 0.2202 - classification_loss: 0.0102 335/500 [===================>..........] - ETA: 1:22 - loss: 0.2302 - regression_loss: 0.2201 - classification_loss: 0.0102 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2298 - regression_loss: 0.2197 - classification_loss: 0.0101 337/500 [===================>..........] - ETA: 1:21 - loss: 0.2297 - regression_loss: 0.2196 - classification_loss: 0.0101 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2299 - regression_loss: 0.2198 - classification_loss: 0.0101 339/500 [===================>..........] - ETA: 1:20 - loss: 0.2301 - regression_loss: 0.2200 - classification_loss: 0.0101 340/500 [===================>..........] - ETA: 1:20 - loss: 0.2304 - regression_loss: 0.2202 - classification_loss: 0.0101 341/500 [===================>..........] - ETA: 1:19 - loss: 0.2301 - regression_loss: 0.2200 - classification_loss: 0.0101 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2296 - regression_loss: 0.2195 - classification_loss: 0.0101 343/500 [===================>..........] - ETA: 1:18 - loss: 0.2296 - regression_loss: 0.2195 - classification_loss: 0.0101 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2297 - regression_loss: 0.2196 - classification_loss: 0.0101 345/500 [===================>..........] - ETA: 1:17 - loss: 0.2304 - regression_loss: 0.2201 - classification_loss: 0.0103 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2304 - regression_loss: 0.2201 - classification_loss: 0.0103 347/500 [===================>..........] - ETA: 1:16 - loss: 0.2307 - regression_loss: 0.2204 - classification_loss: 0.0103 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2303 - regression_loss: 0.2200 - classification_loss: 0.0103 349/500 [===================>..........] - ETA: 1:15 - loss: 0.2302 - regression_loss: 0.2200 - classification_loss: 0.0103 350/500 [====================>.........] 
- ETA: 1:15 - loss: 0.2302 - regression_loss: 0.2199 - classification_loss: 0.0103 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2304 - regression_loss: 0.2202 - classification_loss: 0.0103 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2303 - regression_loss: 0.2200 - classification_loss: 0.0103 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2300 - regression_loss: 0.2198 - classification_loss: 0.0102 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2301 - regression_loss: 0.2198 - classification_loss: 0.0103 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2301 - regression_loss: 0.2199 - classification_loss: 0.0103 356/500 [====================>.........] - ETA: 1:12 - loss: 0.2300 - regression_loss: 0.2197 - classification_loss: 0.0102 357/500 [====================>.........] - ETA: 1:11 - loss: 0.2302 - regression_loss: 0.2199 - classification_loss: 0.0103 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2300 - regression_loss: 0.2197 - classification_loss: 0.0103 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2299 - regression_loss: 0.2196 - classification_loss: 0.0103 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2300 - regression_loss: 0.2197 - classification_loss: 0.0103 361/500 [====================>.........] - ETA: 1:09 - loss: 0.2298 - regression_loss: 0.2195 - classification_loss: 0.0103 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2297 - regression_loss: 0.2194 - classification_loss: 0.0102 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2295 - regression_loss: 0.2192 - classification_loss: 0.0102 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2293 - regression_loss: 0.2191 - classification_loss: 0.0102 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2293 - regression_loss: 0.2191 - classification_loss: 0.0102 366/500 [====================>.........] 
- ETA: 1:07 - loss: 0.2291 - regression_loss: 0.2189 - classification_loss: 0.0102 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2290 - regression_loss: 0.2188 - classification_loss: 0.0101 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2288 - regression_loss: 0.2187 - classification_loss: 0.0101 369/500 [=====================>........] - ETA: 1:05 - loss: 0.2287 - regression_loss: 0.2186 - classification_loss: 0.0101 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2285 - regression_loss: 0.2184 - classification_loss: 0.0101 371/500 [=====================>........] - ETA: 1:04 - loss: 0.2287 - regression_loss: 0.2185 - classification_loss: 0.0101 372/500 [=====================>........] - ETA: 1:04 - loss: 0.2290 - regression_loss: 0.2189 - classification_loss: 0.0101 373/500 [=====================>........] - ETA: 1:03 - loss: 0.2291 - regression_loss: 0.2189 - classification_loss: 0.0101 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2288 - regression_loss: 0.2187 - classification_loss: 0.0101 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2289 - regression_loss: 0.2187 - classification_loss: 0.0102 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2289 - regression_loss: 0.2187 - classification_loss: 0.0102 377/500 [=====================>........] - ETA: 1:01 - loss: 0.2287 - regression_loss: 0.2185 - classification_loss: 0.0101 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2285 - regression_loss: 0.2184 - classification_loss: 0.0101 379/500 [=====================>........] - ETA: 1:00 - loss: 0.2284 - regression_loss: 0.2183 - classification_loss: 0.0101 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2281 - regression_loss: 0.2180 - classification_loss: 0.0101 381/500 [=====================>........] - ETA: 59s - loss: 0.2282 - regression_loss: 0.2180 - classification_loss: 0.0101  382/500 [=====================>........] 
- ETA: 59s - loss: 0.2285 - regression_loss: 0.2184 - classification_loss: 0.0101 383/500 [=====================>........] - ETA: 58s - loss: 0.2285 - regression_loss: 0.2183 - classification_loss: 0.0101 384/500 [======================>.......] - ETA: 58s - loss: 0.2282 - regression_loss: 0.2181 - classification_loss: 0.0102 385/500 [======================>.......] - ETA: 57s - loss: 0.2282 - regression_loss: 0.2181 - classification_loss: 0.0102 386/500 [======================>.......] - ETA: 57s - loss: 0.2285 - regression_loss: 0.2183 - classification_loss: 0.0102 387/500 [======================>.......] - ETA: 56s - loss: 0.2281 - regression_loss: 0.2179 - classification_loss: 0.0102 388/500 [======================>.......] - ETA: 56s - loss: 0.2280 - regression_loss: 0.2178 - classification_loss: 0.0102 389/500 [======================>.......] - ETA: 55s - loss: 0.2279 - regression_loss: 0.2178 - classification_loss: 0.0102 390/500 [======================>.......] - ETA: 55s - loss: 0.2275 - regression_loss: 0.2174 - classification_loss: 0.0101 391/500 [======================>.......] - ETA: 54s - loss: 0.2274 - regression_loss: 0.2173 - classification_loss: 0.0101 392/500 [======================>.......] - ETA: 54s - loss: 0.2275 - regression_loss: 0.2174 - classification_loss: 0.0101 393/500 [======================>.......] - ETA: 53s - loss: 0.2275 - regression_loss: 0.2174 - classification_loss: 0.0101 394/500 [======================>.......] - ETA: 53s - loss: 0.2274 - regression_loss: 0.2172 - classification_loss: 0.0101 395/500 [======================>.......] - ETA: 52s - loss: 0.2271 - regression_loss: 0.2170 - classification_loss: 0.0101 396/500 [======================>.......] - ETA: 52s - loss: 0.2269 - regression_loss: 0.2168 - classification_loss: 0.0101 397/500 [======================>.......] - ETA: 51s - loss: 0.2267 - regression_loss: 0.2166 - classification_loss: 0.0101 398/500 [======================>.......] 
- ETA: 51s - loss: 0.2266 - regression_loss: 0.2165 - classification_loss: 0.0101 399/500 [======================>.......] - ETA: 50s - loss: 0.2263 - regression_loss: 0.2163 - classification_loss: 0.0101 400/500 [=======================>......] - ETA: 50s - loss: 0.2265 - regression_loss: 0.2164 - classification_loss: 0.0101 401/500 [=======================>......] - ETA: 49s - loss: 0.2265 - regression_loss: 0.2164 - classification_loss: 0.0101 402/500 [=======================>......] - ETA: 49s - loss: 0.2262 - regression_loss: 0.2161 - classification_loss: 0.0101 403/500 [=======================>......] - ETA: 48s - loss: 0.2261 - regression_loss: 0.2160 - classification_loss: 0.0101 404/500 [=======================>......] - ETA: 48s - loss: 0.2267 - regression_loss: 0.2166 - classification_loss: 0.0101 405/500 [=======================>......] - ETA: 47s - loss: 0.2266 - regression_loss: 0.2165 - classification_loss: 0.0101 406/500 [=======================>......] - ETA: 47s - loss: 0.2263 - regression_loss: 0.2162 - classification_loss: 0.0101 407/500 [=======================>......] - ETA: 46s - loss: 0.2264 - regression_loss: 0.2163 - classification_loss: 0.0101 408/500 [=======================>......] - ETA: 46s - loss: 0.2262 - regression_loss: 0.2161 - classification_loss: 0.0101 409/500 [=======================>......] - ETA: 45s - loss: 0.2270 - regression_loss: 0.2169 - classification_loss: 0.0101 410/500 [=======================>......] - ETA: 45s - loss: 0.2268 - regression_loss: 0.2167 - classification_loss: 0.0101 411/500 [=======================>......] - ETA: 44s - loss: 0.2276 - regression_loss: 0.2174 - classification_loss: 0.0102 412/500 [=======================>......] - ETA: 44s - loss: 0.2276 - regression_loss: 0.2174 - classification_loss: 0.0102 413/500 [=======================>......] - ETA: 43s - loss: 0.2274 - regression_loss: 0.2172 - classification_loss: 0.0102 414/500 [=======================>......] 
[per-batch progress-bar updates for epoch 31 elided]
500/500 [==============================] - 251s 502ms/step - loss: 0.2309 - regression_loss: 0.2204 - classification_loss: 0.0104
1172 instances of class plum with average precision: 0.7496
mAP: 0.7496
Epoch 00031: saving model to ./training/snapshots/resnet101_pascal_31.h5
Epoch 32/150
[per-batch progress-bar updates for epoch 32 elided]
- ETA: 2:06 - loss: 0.2400 - regression_loss: 0.2292 - classification_loss: 0.0108 250/500 [==============>...............] - ETA: 2:05 - loss: 0.2397 - regression_loss: 0.2289 - classification_loss: 0.0108 251/500 [==============>...............] - ETA: 2:05 - loss: 0.2399 - regression_loss: 0.2291 - classification_loss: 0.0108 252/500 [==============>...............] - ETA: 2:04 - loss: 0.2396 - regression_loss: 0.2288 - classification_loss: 0.0108 253/500 [==============>...............] - ETA: 2:04 - loss: 0.2394 - regression_loss: 0.2286 - classification_loss: 0.0108 254/500 [==============>...............] - ETA: 2:03 - loss: 0.2393 - regression_loss: 0.2285 - classification_loss: 0.0108 255/500 [==============>...............] - ETA: 2:03 - loss: 0.2388 - regression_loss: 0.2281 - classification_loss: 0.0108 256/500 [==============>...............] - ETA: 2:02 - loss: 0.2386 - regression_loss: 0.2279 - classification_loss: 0.0107 257/500 [==============>...............] - ETA: 2:02 - loss: 0.2387 - regression_loss: 0.2280 - classification_loss: 0.0107 258/500 [==============>...............] - ETA: 2:01 - loss: 0.2384 - regression_loss: 0.2277 - classification_loss: 0.0107 259/500 [==============>...............] - ETA: 2:01 - loss: 0.2384 - regression_loss: 0.2277 - classification_loss: 0.0107 260/500 [==============>...............] - ETA: 2:00 - loss: 0.2383 - regression_loss: 0.2277 - classification_loss: 0.0107 261/500 [==============>...............] - ETA: 2:00 - loss: 0.2379 - regression_loss: 0.2273 - classification_loss: 0.0106 262/500 [==============>...............] - ETA: 1:59 - loss: 0.2378 - regression_loss: 0.2271 - classification_loss: 0.0106 263/500 [==============>...............] - ETA: 1:59 - loss: 0.2375 - regression_loss: 0.2268 - classification_loss: 0.0106 264/500 [==============>...............] - ETA: 1:58 - loss: 0.2373 - regression_loss: 0.2267 - classification_loss: 0.0106 265/500 [==============>...............] 
- ETA: 1:58 - loss: 0.2373 - regression_loss: 0.2267 - classification_loss: 0.0106 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2372 - regression_loss: 0.2266 - classification_loss: 0.0106 267/500 [===============>..............] - ETA: 1:57 - loss: 0.2375 - regression_loss: 0.2269 - classification_loss: 0.0106 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2370 - regression_loss: 0.2264 - classification_loss: 0.0106 269/500 [===============>..............] - ETA: 1:56 - loss: 0.2369 - regression_loss: 0.2263 - classification_loss: 0.0106 270/500 [===============>..............] - ETA: 1:55 - loss: 0.2364 - regression_loss: 0.2258 - classification_loss: 0.0106 271/500 [===============>..............] - ETA: 1:55 - loss: 0.2363 - regression_loss: 0.2257 - classification_loss: 0.0106 272/500 [===============>..............] - ETA: 1:54 - loss: 0.2367 - regression_loss: 0.2261 - classification_loss: 0.0106 273/500 [===============>..............] - ETA: 1:54 - loss: 0.2365 - regression_loss: 0.2259 - classification_loss: 0.0106 274/500 [===============>..............] - ETA: 1:53 - loss: 0.2362 - regression_loss: 0.2256 - classification_loss: 0.0106 275/500 [===============>..............] - ETA: 1:53 - loss: 0.2357 - regression_loss: 0.2252 - classification_loss: 0.0105 276/500 [===============>..............] - ETA: 1:52 - loss: 0.2357 - regression_loss: 0.2251 - classification_loss: 0.0105 277/500 [===============>..............] - ETA: 1:52 - loss: 0.2352 - regression_loss: 0.2247 - classification_loss: 0.0105 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2350 - regression_loss: 0.2245 - classification_loss: 0.0105 279/500 [===============>..............] - ETA: 1:51 - loss: 0.2351 - regression_loss: 0.2246 - classification_loss: 0.0105 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2352 - regression_loss: 0.2246 - classification_loss: 0.0105 281/500 [===============>..............] 
- ETA: 1:50 - loss: 0.2355 - regression_loss: 0.2249 - classification_loss: 0.0105 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2349 - regression_loss: 0.2244 - classification_loss: 0.0105 283/500 [===============>..............] - ETA: 1:49 - loss: 0.2347 - regression_loss: 0.2242 - classification_loss: 0.0105 284/500 [================>.............] - ETA: 1:48 - loss: 0.2351 - regression_loss: 0.2245 - classification_loss: 0.0105 285/500 [================>.............] - ETA: 1:48 - loss: 0.2355 - regression_loss: 0.2250 - classification_loss: 0.0105 286/500 [================>.............] - ETA: 1:47 - loss: 0.2350 - regression_loss: 0.2246 - classification_loss: 0.0105 287/500 [================>.............] - ETA: 1:47 - loss: 0.2351 - regression_loss: 0.2246 - classification_loss: 0.0105 288/500 [================>.............] - ETA: 1:46 - loss: 0.2348 - regression_loss: 0.2243 - classification_loss: 0.0105 289/500 [================>.............] - ETA: 1:46 - loss: 0.2351 - regression_loss: 0.2246 - classification_loss: 0.0105 290/500 [================>.............] - ETA: 1:45 - loss: 0.2349 - regression_loss: 0.2244 - classification_loss: 0.0105 291/500 [================>.............] - ETA: 1:45 - loss: 0.2350 - regression_loss: 0.2246 - classification_loss: 0.0104 292/500 [================>.............] - ETA: 1:44 - loss: 0.2373 - regression_loss: 0.2268 - classification_loss: 0.0104 293/500 [================>.............] - ETA: 1:44 - loss: 0.2373 - regression_loss: 0.2269 - classification_loss: 0.0104 294/500 [================>.............] - ETA: 1:43 - loss: 0.2373 - regression_loss: 0.2269 - classification_loss: 0.0104 295/500 [================>.............] - ETA: 1:43 - loss: 0.2374 - regression_loss: 0.2270 - classification_loss: 0.0104 296/500 [================>.............] - ETA: 1:42 - loss: 0.2373 - regression_loss: 0.2269 - classification_loss: 0.0104 297/500 [================>.............] 
- ETA: 1:42 - loss: 0.2375 - regression_loss: 0.2271 - classification_loss: 0.0104 298/500 [================>.............] - ETA: 1:41 - loss: 0.2374 - regression_loss: 0.2270 - classification_loss: 0.0104 299/500 [================>.............] - ETA: 1:41 - loss: 0.2375 - regression_loss: 0.2270 - classification_loss: 0.0105 300/500 [=================>............] - ETA: 1:40 - loss: 0.2379 - regression_loss: 0.2274 - classification_loss: 0.0105 301/500 [=================>............] - ETA: 1:40 - loss: 0.2377 - regression_loss: 0.2272 - classification_loss: 0.0105 302/500 [=================>............] - ETA: 1:39 - loss: 0.2382 - regression_loss: 0.2277 - classification_loss: 0.0105 303/500 [=================>............] - ETA: 1:39 - loss: 0.2378 - regression_loss: 0.2273 - classification_loss: 0.0105 304/500 [=================>............] - ETA: 1:38 - loss: 0.2381 - regression_loss: 0.2276 - classification_loss: 0.0104 305/500 [=================>............] - ETA: 1:38 - loss: 0.2380 - regression_loss: 0.2276 - classification_loss: 0.0104 306/500 [=================>............] - ETA: 1:37 - loss: 0.2380 - regression_loss: 0.2276 - classification_loss: 0.0104 307/500 [=================>............] - ETA: 1:37 - loss: 0.2387 - regression_loss: 0.2283 - classification_loss: 0.0104 308/500 [=================>............] - ETA: 1:36 - loss: 0.2385 - regression_loss: 0.2281 - classification_loss: 0.0104 309/500 [=================>............] - ETA: 1:36 - loss: 0.2385 - regression_loss: 0.2281 - classification_loss: 0.0104 310/500 [=================>............] - ETA: 1:35 - loss: 0.2384 - regression_loss: 0.2280 - classification_loss: 0.0104 311/500 [=================>............] - ETA: 1:34 - loss: 0.2382 - regression_loss: 0.2278 - classification_loss: 0.0104 312/500 [=================>............] - ETA: 1:34 - loss: 0.2380 - regression_loss: 0.2277 - classification_loss: 0.0104 313/500 [=================>............] 
- ETA: 1:33 - loss: 0.2386 - regression_loss: 0.2281 - classification_loss: 0.0104 314/500 [=================>............] - ETA: 1:33 - loss: 0.2383 - regression_loss: 0.2279 - classification_loss: 0.0104 315/500 [=================>............] - ETA: 1:32 - loss: 0.2378 - regression_loss: 0.2274 - classification_loss: 0.0104 316/500 [=================>............] - ETA: 1:32 - loss: 0.2381 - regression_loss: 0.2277 - classification_loss: 0.0104 317/500 [==================>...........] - ETA: 1:31 - loss: 0.2377 - regression_loss: 0.2273 - classification_loss: 0.0104 318/500 [==================>...........] - ETA: 1:31 - loss: 0.2375 - regression_loss: 0.2271 - classification_loss: 0.0103 319/500 [==================>...........] - ETA: 1:30 - loss: 0.2373 - regression_loss: 0.2270 - classification_loss: 0.0103 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2373 - regression_loss: 0.2270 - classification_loss: 0.0103 321/500 [==================>...........] - ETA: 1:29 - loss: 0.2368 - regression_loss: 0.2265 - classification_loss: 0.0103 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2368 - regression_loss: 0.2265 - classification_loss: 0.0103 323/500 [==================>...........] - ETA: 1:28 - loss: 0.2364 - regression_loss: 0.2261 - classification_loss: 0.0103 324/500 [==================>...........] - ETA: 1:28 - loss: 0.2363 - regression_loss: 0.2261 - classification_loss: 0.0103 325/500 [==================>...........] - ETA: 1:27 - loss: 0.2364 - regression_loss: 0.2261 - classification_loss: 0.0103 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2365 - regression_loss: 0.2262 - classification_loss: 0.0103 327/500 [==================>...........] - ETA: 1:26 - loss: 0.2364 - regression_loss: 0.2261 - classification_loss: 0.0103 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2363 - regression_loss: 0.2260 - classification_loss: 0.0103 329/500 [==================>...........] 
- ETA: 1:25 - loss: 0.2360 - regression_loss: 0.2257 - classification_loss: 0.0103 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2356 - regression_loss: 0.2253 - classification_loss: 0.0103 331/500 [==================>...........] - ETA: 1:24 - loss: 0.2354 - regression_loss: 0.2251 - classification_loss: 0.0103 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2356 - regression_loss: 0.2253 - classification_loss: 0.0103 333/500 [==================>...........] - ETA: 1:23 - loss: 0.2352 - regression_loss: 0.2249 - classification_loss: 0.0103 334/500 [===================>..........] - ETA: 1:23 - loss: 0.2358 - regression_loss: 0.2255 - classification_loss: 0.0103 335/500 [===================>..........] - ETA: 1:22 - loss: 0.2360 - regression_loss: 0.2256 - classification_loss: 0.0103 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2354 - regression_loss: 0.2251 - classification_loss: 0.0103 337/500 [===================>..........] - ETA: 1:21 - loss: 0.2358 - regression_loss: 0.2255 - classification_loss: 0.0103 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2357 - regression_loss: 0.2254 - classification_loss: 0.0103 339/500 [===================>..........] - ETA: 1:20 - loss: 0.2356 - regression_loss: 0.2253 - classification_loss: 0.0103 340/500 [===================>..........] - ETA: 1:20 - loss: 0.2354 - regression_loss: 0.2252 - classification_loss: 0.0103 341/500 [===================>..........] - ETA: 1:19 - loss: 0.2357 - regression_loss: 0.2254 - classification_loss: 0.0103 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2355 - regression_loss: 0.2252 - classification_loss: 0.0103 343/500 [===================>..........] - ETA: 1:18 - loss: 0.2354 - regression_loss: 0.2251 - classification_loss: 0.0103 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2351 - regression_loss: 0.2248 - classification_loss: 0.0103 345/500 [===================>..........] 
- ETA: 1:17 - loss: 0.2349 - regression_loss: 0.2246 - classification_loss: 0.0103 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2348 - regression_loss: 0.2245 - classification_loss: 0.0103 347/500 [===================>..........] - ETA: 1:16 - loss: 0.2355 - regression_loss: 0.2250 - classification_loss: 0.0105 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2358 - regression_loss: 0.2253 - classification_loss: 0.0105 349/500 [===================>..........] - ETA: 1:15 - loss: 0.2367 - regression_loss: 0.2262 - classification_loss: 0.0105 350/500 [====================>.........] - ETA: 1:15 - loss: 0.2368 - regression_loss: 0.2263 - classification_loss: 0.0104 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2372 - regression_loss: 0.2267 - classification_loss: 0.0105 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2369 - regression_loss: 0.2265 - classification_loss: 0.0104 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2369 - regression_loss: 0.2265 - classification_loss: 0.0104 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2366 - regression_loss: 0.2262 - classification_loss: 0.0104 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2369 - regression_loss: 0.2265 - classification_loss: 0.0104 356/500 [====================>.........] - ETA: 1:12 - loss: 0.2371 - regression_loss: 0.2267 - classification_loss: 0.0104 357/500 [====================>.........] - ETA: 1:11 - loss: 0.2369 - regression_loss: 0.2265 - classification_loss: 0.0104 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2367 - regression_loss: 0.2263 - classification_loss: 0.0104 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2364 - regression_loss: 0.2261 - classification_loss: 0.0104 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2359 - regression_loss: 0.2256 - classification_loss: 0.0103 361/500 [====================>.........] 
- ETA: 1:09 - loss: 0.2356 - regression_loss: 0.2253 - classification_loss: 0.0103 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2358 - regression_loss: 0.2254 - classification_loss: 0.0104 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2358 - regression_loss: 0.2255 - classification_loss: 0.0104 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2359 - regression_loss: 0.2256 - classification_loss: 0.0104 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2363 - regression_loss: 0.2259 - classification_loss: 0.0104 366/500 [====================>.........] - ETA: 1:07 - loss: 0.2361 - regression_loss: 0.2258 - classification_loss: 0.0103 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2368 - regression_loss: 0.2265 - classification_loss: 0.0103 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2365 - regression_loss: 0.2262 - classification_loss: 0.0103 369/500 [=====================>........] - ETA: 1:05 - loss: 0.2365 - regression_loss: 0.2262 - classification_loss: 0.0103 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2365 - regression_loss: 0.2262 - classification_loss: 0.0103 371/500 [=====================>........] - ETA: 1:04 - loss: 0.2367 - regression_loss: 0.2264 - classification_loss: 0.0103 372/500 [=====================>........] - ETA: 1:04 - loss: 0.2366 - regression_loss: 0.2263 - classification_loss: 0.0103 373/500 [=====================>........] - ETA: 1:03 - loss: 0.2367 - regression_loss: 0.2264 - classification_loss: 0.0103 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2364 - regression_loss: 0.2261 - classification_loss: 0.0103 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2362 - regression_loss: 0.2259 - classification_loss: 0.0103 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2359 - regression_loss: 0.2256 - classification_loss: 0.0103 377/500 [=====================>........] 
- ETA: 1:01 - loss: 0.2360 - regression_loss: 0.2257 - classification_loss: 0.0102 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2355 - regression_loss: 0.2253 - classification_loss: 0.0102 379/500 [=====================>........] - ETA: 1:00 - loss: 0.2356 - regression_loss: 0.2254 - classification_loss: 0.0102 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2355 - regression_loss: 0.2253 - classification_loss: 0.0102 381/500 [=====================>........] - ETA: 59s - loss: 0.2355 - regression_loss: 0.2253 - classification_loss: 0.0102  382/500 [=====================>........] - ETA: 59s - loss: 0.2353 - regression_loss: 0.2251 - classification_loss: 0.0102 383/500 [=====================>........] - ETA: 58s - loss: 0.2349 - regression_loss: 0.2247 - classification_loss: 0.0102 384/500 [======================>.......] - ETA: 58s - loss: 0.2350 - regression_loss: 0.2248 - classification_loss: 0.0102 385/500 [======================>.......] - ETA: 57s - loss: 0.2350 - regression_loss: 0.2249 - classification_loss: 0.0102 386/500 [======================>.......] - ETA: 57s - loss: 0.2348 - regression_loss: 0.2246 - classification_loss: 0.0102 387/500 [======================>.......] - ETA: 56s - loss: 0.2346 - regression_loss: 0.2245 - classification_loss: 0.0101 388/500 [======================>.......] - ETA: 56s - loss: 0.2346 - regression_loss: 0.2244 - classification_loss: 0.0101 389/500 [======================>.......] - ETA: 55s - loss: 0.2346 - regression_loss: 0.2245 - classification_loss: 0.0101 390/500 [======================>.......] - ETA: 55s - loss: 0.2346 - regression_loss: 0.2245 - classification_loss: 0.0101 391/500 [======================>.......] - ETA: 54s - loss: 0.2351 - regression_loss: 0.2249 - classification_loss: 0.0101 392/500 [======================>.......] - ETA: 54s - loss: 0.2352 - regression_loss: 0.2251 - classification_loss: 0.0101 393/500 [======================>.......] 
- ETA: 53s - loss: 0.2350 - regression_loss: 0.2249 - classification_loss: 0.0101 394/500 [======================>.......] - ETA: 53s - loss: 0.2351 - regression_loss: 0.2250 - classification_loss: 0.0101 395/500 [======================>.......] - ETA: 52s - loss: 0.2352 - regression_loss: 0.2251 - classification_loss: 0.0101 396/500 [======================>.......] - ETA: 52s - loss: 0.2351 - regression_loss: 0.2250 - classification_loss: 0.0101 397/500 [======================>.......] - ETA: 51s - loss: 0.2353 - regression_loss: 0.2252 - classification_loss: 0.0101 398/500 [======================>.......] - ETA: 51s - loss: 0.2356 - regression_loss: 0.2255 - classification_loss: 0.0102 399/500 [======================>.......] - ETA: 50s - loss: 0.2360 - regression_loss: 0.2258 - classification_loss: 0.0102 400/500 [=======================>......] - ETA: 50s - loss: 0.2360 - regression_loss: 0.2258 - classification_loss: 0.0102 401/500 [=======================>......] - ETA: 49s - loss: 0.2358 - regression_loss: 0.2256 - classification_loss: 0.0102 402/500 [=======================>......] - ETA: 49s - loss: 0.2360 - regression_loss: 0.2257 - classification_loss: 0.0102 403/500 [=======================>......] - ETA: 48s - loss: 0.2357 - regression_loss: 0.2255 - classification_loss: 0.0102 404/500 [=======================>......] - ETA: 48s - loss: 0.2356 - regression_loss: 0.2254 - classification_loss: 0.0102 405/500 [=======================>......] - ETA: 47s - loss: 0.2358 - regression_loss: 0.2255 - classification_loss: 0.0102 406/500 [=======================>......] - ETA: 47s - loss: 0.2361 - regression_loss: 0.2258 - classification_loss: 0.0103 407/500 [=======================>......] - ETA: 46s - loss: 0.2361 - regression_loss: 0.2258 - classification_loss: 0.0103 408/500 [=======================>......] - ETA: 46s - loss: 0.2363 - regression_loss: 0.2261 - classification_loss: 0.0103 409/500 [=======================>......] 
- ETA: 45s - loss: 0.2362 - regression_loss: 0.2259 - classification_loss: 0.0102 410/500 [=======================>......] - ETA: 45s - loss: 0.2363 - regression_loss: 0.2260 - classification_loss: 0.0103 411/500 [=======================>......] - ETA: 44s - loss: 0.2361 - regression_loss: 0.2259 - classification_loss: 0.0102 412/500 [=======================>......] - ETA: 44s - loss: 0.2363 - regression_loss: 0.2260 - classification_loss: 0.0103 413/500 [=======================>......] - ETA: 43s - loss: 0.2362 - regression_loss: 0.2259 - classification_loss: 0.0103 414/500 [=======================>......] - ETA: 43s - loss: 0.2361 - regression_loss: 0.2258 - classification_loss: 0.0103 415/500 [=======================>......] - ETA: 42s - loss: 0.2362 - regression_loss: 0.2259 - classification_loss: 0.0103 416/500 [=======================>......] - ETA: 42s - loss: 0.2361 - regression_loss: 0.2259 - classification_loss: 0.0103 417/500 [========================>.....] - ETA: 41s - loss: 0.2360 - regression_loss: 0.2257 - classification_loss: 0.0103 418/500 [========================>.....] - ETA: 41s - loss: 0.2362 - regression_loss: 0.2259 - classification_loss: 0.0102 419/500 [========================>.....] - ETA: 40s - loss: 0.2363 - regression_loss: 0.2260 - classification_loss: 0.0102 420/500 [========================>.....] - ETA: 40s - loss: 0.2361 - regression_loss: 0.2258 - classification_loss: 0.0102 421/500 [========================>.....] - ETA: 39s - loss: 0.2359 - regression_loss: 0.2257 - classification_loss: 0.0102 422/500 [========================>.....] - ETA: 39s - loss: 0.2356 - regression_loss: 0.2254 - classification_loss: 0.0102 423/500 [========================>.....] - ETA: 38s - loss: 0.2360 - regression_loss: 0.2258 - classification_loss: 0.0102 424/500 [========================>.....] - ETA: 38s - loss: 0.2362 - regression_loss: 0.2260 - classification_loss: 0.0102 425/500 [========================>.....] 
- ETA: 37s - loss: 0.2364 - regression_loss: 0.2262 - classification_loss: 0.0102 426/500 [========================>.....] - ETA: 37s - loss: 0.2366 - regression_loss: 0.2265 - classification_loss: 0.0102 427/500 [========================>.....] - ETA: 36s - loss: 0.2368 - regression_loss: 0.2267 - classification_loss: 0.0102 428/500 [========================>.....] - ETA: 36s - loss: 0.2369 - regression_loss: 0.2267 - classification_loss: 0.0102 429/500 [========================>.....] - ETA: 35s - loss: 0.2369 - regression_loss: 0.2267 - classification_loss: 0.0102 430/500 [========================>.....] - ETA: 35s - loss: 0.2368 - regression_loss: 0.2266 - classification_loss: 0.0101 431/500 [========================>.....] - ETA: 34s - loss: 0.2368 - regression_loss: 0.2267 - classification_loss: 0.0101 432/500 [========================>.....] - ETA: 34s - loss: 0.2368 - regression_loss: 0.2267 - classification_loss: 0.0101 433/500 [========================>.....] - ETA: 33s - loss: 0.2370 - regression_loss: 0.2269 - classification_loss: 0.0101 434/500 [=========================>....] - ETA: 33s - loss: 0.2372 - regression_loss: 0.2271 - classification_loss: 0.0101 435/500 [=========================>....] - ETA: 32s - loss: 0.2374 - regression_loss: 0.2272 - classification_loss: 0.0102 436/500 [=========================>....] - ETA: 32s - loss: 0.2376 - regression_loss: 0.2274 - classification_loss: 0.0102 437/500 [=========================>....] - ETA: 31s - loss: 0.2383 - regression_loss: 0.2281 - classification_loss: 0.0102 438/500 [=========================>....] - ETA: 31s - loss: 0.2380 - regression_loss: 0.2279 - classification_loss: 0.0102 439/500 [=========================>....] - ETA: 30s - loss: 0.2381 - regression_loss: 0.2279 - classification_loss: 0.0102 440/500 [=========================>....] - ETA: 30s - loss: 0.2383 - regression_loss: 0.2281 - classification_loss: 0.0102 441/500 [=========================>....] 
- ETA: 29s - loss: 0.2386 - regression_loss: 0.2283 - classification_loss: 0.0103 442/500 [=========================>....] - ETA: 29s - loss: 0.2387 - regression_loss: 0.2284 - classification_loss: 0.0103 443/500 [=========================>....] - ETA: 28s - loss: 0.2385 - regression_loss: 0.2282 - classification_loss: 0.0103 444/500 [=========================>....] - ETA: 28s - loss: 0.2381 - regression_loss: 0.2279 - classification_loss: 0.0102 445/500 [=========================>....] - ETA: 27s - loss: 0.2378 - regression_loss: 0.2276 - classification_loss: 0.0102 446/500 [=========================>....] - ETA: 27s - loss: 0.2376 - regression_loss: 0.2274 - classification_loss: 0.0102 447/500 [=========================>....] - ETA: 26s - loss: 0.2376 - regression_loss: 0.2274 - classification_loss: 0.0102 448/500 [=========================>....] - ETA: 26s - loss: 0.2378 - regression_loss: 0.2276 - classification_loss: 0.0102 449/500 [=========================>....] - ETA: 25s - loss: 0.2378 - regression_loss: 0.2276 - classification_loss: 0.0102 450/500 [==========================>...] - ETA: 25s - loss: 0.2379 - regression_loss: 0.2277 - classification_loss: 0.0102 451/500 [==========================>...] - ETA: 24s - loss: 0.2377 - regression_loss: 0.2275 - classification_loss: 0.0102 452/500 [==========================>...] - ETA: 24s - loss: 0.2375 - regression_loss: 0.2273 - classification_loss: 0.0102 453/500 [==========================>...] - ETA: 23s - loss: 0.2374 - regression_loss: 0.2272 - classification_loss: 0.0102 454/500 [==========================>...] - ETA: 23s - loss: 0.2371 - regression_loss: 0.2269 - classification_loss: 0.0102 455/500 [==========================>...] - ETA: 22s - loss: 0.2374 - regression_loss: 0.2272 - classification_loss: 0.0102 456/500 [==========================>...] - ETA: 22s - loss: 0.2373 - regression_loss: 0.2271 - classification_loss: 0.0102 457/500 [==========================>...] 
[Epoch 32: per-batch progress updates (steps 458–499 of 500) omitted; loss held steady near 0.236]
500/500 [==============================] - 251s 502ms/step - loss: 0.2363 - regression_loss: 0.2262 - classification_loss: 0.0101
1172 instances of class plum with average precision: 0.7412
mAP: 0.7412
Epoch 00032: saving model to ./training/snapshots/resnet101_pascal_32.h5
Epoch 33/150
[Epoch 33: per-batch progress updates (steps 1–292 of 500) omitted; loss settled near 0.225 by step 292, log truncated mid-epoch]
- ETA: 1:44 - loss: 0.2247 - regression_loss: 0.2150 - classification_loss: 0.0096 293/500 [================>.............] - ETA: 1:43 - loss: 0.2261 - regression_loss: 0.2164 - classification_loss: 0.0097 294/500 [================>.............] - ETA: 1:43 - loss: 0.2261 - regression_loss: 0.2164 - classification_loss: 0.0097 295/500 [================>.............] - ETA: 1:42 - loss: 0.2260 - regression_loss: 0.2163 - classification_loss: 0.0097 296/500 [================>.............] - ETA: 1:42 - loss: 0.2263 - regression_loss: 0.2166 - classification_loss: 0.0097 297/500 [================>.............] - ETA: 1:41 - loss: 0.2273 - regression_loss: 0.2176 - classification_loss: 0.0098 298/500 [================>.............] - ETA: 1:41 - loss: 0.2273 - regression_loss: 0.2176 - classification_loss: 0.0097 299/500 [================>.............] - ETA: 1:40 - loss: 0.2281 - regression_loss: 0.2184 - classification_loss: 0.0098 300/500 [=================>............] - ETA: 1:40 - loss: 0.2279 - regression_loss: 0.2181 - classification_loss: 0.0098 301/500 [=================>............] - ETA: 1:39 - loss: 0.2281 - regression_loss: 0.2183 - classification_loss: 0.0098 302/500 [=================>............] - ETA: 1:39 - loss: 0.2281 - regression_loss: 0.2183 - classification_loss: 0.0098 303/500 [=================>............] - ETA: 1:38 - loss: 0.2277 - regression_loss: 0.2180 - classification_loss: 0.0097 304/500 [=================>............] - ETA: 1:38 - loss: 0.2278 - regression_loss: 0.2181 - classification_loss: 0.0097 305/500 [=================>............] - ETA: 1:37 - loss: 0.2277 - regression_loss: 0.2179 - classification_loss: 0.0097 306/500 [=================>............] - ETA: 1:37 - loss: 0.2279 - regression_loss: 0.2181 - classification_loss: 0.0098 307/500 [=================>............] - ETA: 1:36 - loss: 0.2281 - regression_loss: 0.2184 - classification_loss: 0.0098 308/500 [=================>............] 
- ETA: 1:36 - loss: 0.2279 - regression_loss: 0.2181 - classification_loss: 0.0097 309/500 [=================>............] - ETA: 1:35 - loss: 0.2276 - regression_loss: 0.2179 - classification_loss: 0.0097 310/500 [=================>............] - ETA: 1:35 - loss: 0.2277 - regression_loss: 0.2179 - classification_loss: 0.0098 311/500 [=================>............] - ETA: 1:34 - loss: 0.2275 - regression_loss: 0.2177 - classification_loss: 0.0097 312/500 [=================>............] - ETA: 1:34 - loss: 0.2278 - regression_loss: 0.2181 - classification_loss: 0.0097 313/500 [=================>............] - ETA: 1:33 - loss: 0.2281 - regression_loss: 0.2184 - classification_loss: 0.0097 314/500 [=================>............] - ETA: 1:33 - loss: 0.2285 - regression_loss: 0.2187 - classification_loss: 0.0097 315/500 [=================>............] - ETA: 1:32 - loss: 0.2284 - regression_loss: 0.2187 - classification_loss: 0.0097 316/500 [=================>............] - ETA: 1:32 - loss: 0.2288 - regression_loss: 0.2191 - classification_loss: 0.0097 317/500 [==================>...........] - ETA: 1:31 - loss: 0.2283 - regression_loss: 0.2186 - classification_loss: 0.0097 318/500 [==================>...........] - ETA: 1:31 - loss: 0.2287 - regression_loss: 0.2190 - classification_loss: 0.0097 319/500 [==================>...........] - ETA: 1:30 - loss: 0.2286 - regression_loss: 0.2190 - classification_loss: 0.0096 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2290 - regression_loss: 0.2194 - classification_loss: 0.0097 321/500 [==================>...........] - ETA: 1:29 - loss: 0.2290 - regression_loss: 0.2193 - classification_loss: 0.0096 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2292 - regression_loss: 0.2196 - classification_loss: 0.0097 323/500 [==================>...........] - ETA: 1:28 - loss: 0.2293 - regression_loss: 0.2197 - classification_loss: 0.0096 324/500 [==================>...........] 
- ETA: 1:28 - loss: 0.2300 - regression_loss: 0.2203 - classification_loss: 0.0096 325/500 [==================>...........] - ETA: 1:27 - loss: 0.2301 - regression_loss: 0.2205 - classification_loss: 0.0096 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2302 - regression_loss: 0.2206 - classification_loss: 0.0096 327/500 [==================>...........] - ETA: 1:26 - loss: 0.2300 - regression_loss: 0.2204 - classification_loss: 0.0096 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2301 - regression_loss: 0.2206 - classification_loss: 0.0096 329/500 [==================>...........] - ETA: 1:25 - loss: 0.2299 - regression_loss: 0.2203 - classification_loss: 0.0095 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2299 - regression_loss: 0.2204 - classification_loss: 0.0095 331/500 [==================>...........] - ETA: 1:24 - loss: 0.2296 - regression_loss: 0.2201 - classification_loss: 0.0095 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2298 - regression_loss: 0.2203 - classification_loss: 0.0095 333/500 [==================>...........] - ETA: 1:23 - loss: 0.2299 - regression_loss: 0.2203 - classification_loss: 0.0095 334/500 [===================>..........] - ETA: 1:23 - loss: 0.2298 - regression_loss: 0.2203 - classification_loss: 0.0095 335/500 [===================>..........] - ETA: 1:22 - loss: 0.2299 - regression_loss: 0.2204 - classification_loss: 0.0095 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2297 - regression_loss: 0.2202 - classification_loss: 0.0095 337/500 [===================>..........] - ETA: 1:21 - loss: 0.2297 - regression_loss: 0.2202 - classification_loss: 0.0095 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2295 - regression_loss: 0.2200 - classification_loss: 0.0095 339/500 [===================>..........] - ETA: 1:20 - loss: 0.2300 - regression_loss: 0.2205 - classification_loss: 0.0095 340/500 [===================>..........] 
- ETA: 1:20 - loss: 0.2300 - regression_loss: 0.2205 - classification_loss: 0.0095 341/500 [===================>..........] - ETA: 1:19 - loss: 0.2301 - regression_loss: 0.2206 - classification_loss: 0.0095 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2299 - regression_loss: 0.2204 - classification_loss: 0.0095 343/500 [===================>..........] - ETA: 1:18 - loss: 0.2304 - regression_loss: 0.2207 - classification_loss: 0.0097 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2304 - regression_loss: 0.2207 - classification_loss: 0.0097 345/500 [===================>..........] - ETA: 1:17 - loss: 0.2303 - regression_loss: 0.2206 - classification_loss: 0.0097 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2302 - regression_loss: 0.2205 - classification_loss: 0.0097 347/500 [===================>..........] - ETA: 1:16 - loss: 0.2304 - regression_loss: 0.2206 - classification_loss: 0.0098 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2306 - regression_loss: 0.2208 - classification_loss: 0.0098 349/500 [===================>..........] - ETA: 1:15 - loss: 0.2305 - regression_loss: 0.2207 - classification_loss: 0.0098 350/500 [====================>.........] - ETA: 1:15 - loss: 0.2305 - regression_loss: 0.2207 - classification_loss: 0.0098 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2303 - regression_loss: 0.2205 - classification_loss: 0.0098 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2304 - regression_loss: 0.2206 - classification_loss: 0.0098 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2306 - regression_loss: 0.2208 - classification_loss: 0.0098 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2302 - regression_loss: 0.2205 - classification_loss: 0.0098 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2300 - regression_loss: 0.2203 - classification_loss: 0.0097 356/500 [====================>.........] 
- ETA: 1:12 - loss: 0.2313 - regression_loss: 0.2215 - classification_loss: 0.0098 357/500 [====================>.........] - ETA: 1:11 - loss: 0.2315 - regression_loss: 0.2216 - classification_loss: 0.0098 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2317 - regression_loss: 0.2219 - classification_loss: 0.0098 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2316 - regression_loss: 0.2218 - classification_loss: 0.0098 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2319 - regression_loss: 0.2220 - classification_loss: 0.0098 361/500 [====================>.........] - ETA: 1:09 - loss: 0.2316 - regression_loss: 0.2218 - classification_loss: 0.0098 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2317 - regression_loss: 0.2218 - classification_loss: 0.0098 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2318 - regression_loss: 0.2220 - classification_loss: 0.0098 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2317 - regression_loss: 0.2219 - classification_loss: 0.0098 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2315 - regression_loss: 0.2217 - classification_loss: 0.0098 366/500 [====================>.........] - ETA: 1:07 - loss: 0.2313 - regression_loss: 0.2215 - classification_loss: 0.0098 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2310 - regression_loss: 0.2212 - classification_loss: 0.0098 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2308 - regression_loss: 0.2210 - classification_loss: 0.0098 369/500 [=====================>........] - ETA: 1:05 - loss: 0.2308 - regression_loss: 0.2210 - classification_loss: 0.0098 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2312 - regression_loss: 0.2214 - classification_loss: 0.0098 371/500 [=====================>........] - ETA: 1:04 - loss: 0.2313 - regression_loss: 0.2214 - classification_loss: 0.0098 372/500 [=====================>........] 
- ETA: 1:04 - loss: 0.2310 - regression_loss: 0.2212 - classification_loss: 0.0098 373/500 [=====================>........] - ETA: 1:03 - loss: 0.2307 - regression_loss: 0.2209 - classification_loss: 0.0098 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2305 - regression_loss: 0.2207 - classification_loss: 0.0098 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2304 - regression_loss: 0.2206 - classification_loss: 0.0098 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2302 - regression_loss: 0.2205 - classification_loss: 0.0098 377/500 [=====================>........] - ETA: 1:01 - loss: 0.2302 - regression_loss: 0.2204 - classification_loss: 0.0098 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2301 - regression_loss: 0.2204 - classification_loss: 0.0098 379/500 [=====================>........] - ETA: 1:00 - loss: 0.2304 - regression_loss: 0.2206 - classification_loss: 0.0099 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2304 - regression_loss: 0.2205 - classification_loss: 0.0099 381/500 [=====================>........] - ETA: 59s - loss: 0.2303 - regression_loss: 0.2205 - classification_loss: 0.0098  382/500 [=====================>........] - ETA: 59s - loss: 0.2305 - regression_loss: 0.2206 - classification_loss: 0.0099 383/500 [=====================>........] - ETA: 58s - loss: 0.2303 - regression_loss: 0.2204 - classification_loss: 0.0099 384/500 [======================>.......] - ETA: 58s - loss: 0.2300 - regression_loss: 0.2201 - classification_loss: 0.0099 385/500 [======================>.......] - ETA: 57s - loss: 0.2302 - regression_loss: 0.2203 - classification_loss: 0.0099 386/500 [======================>.......] - ETA: 57s - loss: 0.2311 - regression_loss: 0.2212 - classification_loss: 0.0099 387/500 [======================>.......] - ETA: 56s - loss: 0.2308 - regression_loss: 0.2210 - classification_loss: 0.0099 388/500 [======================>.......] 
- ETA: 56s - loss: 0.2305 - regression_loss: 0.2207 - classification_loss: 0.0099 389/500 [======================>.......] - ETA: 55s - loss: 0.2304 - regression_loss: 0.2205 - classification_loss: 0.0098 390/500 [======================>.......] - ETA: 55s - loss: 0.2302 - regression_loss: 0.2204 - classification_loss: 0.0098 391/500 [======================>.......] - ETA: 54s - loss: 0.2300 - regression_loss: 0.2202 - classification_loss: 0.0098 392/500 [======================>.......] - ETA: 54s - loss: 0.2302 - regression_loss: 0.2204 - classification_loss: 0.0099 393/500 [======================>.......] - ETA: 53s - loss: 0.2308 - regression_loss: 0.2208 - classification_loss: 0.0100 394/500 [======================>.......] - ETA: 53s - loss: 0.2308 - regression_loss: 0.2209 - classification_loss: 0.0100 395/500 [======================>.......] - ETA: 52s - loss: 0.2304 - regression_loss: 0.2205 - classification_loss: 0.0099 396/500 [======================>.......] - ETA: 52s - loss: 0.2302 - regression_loss: 0.2203 - classification_loss: 0.0099 397/500 [======================>.......] - ETA: 51s - loss: 0.2301 - regression_loss: 0.2202 - classification_loss: 0.0099 398/500 [======================>.......] - ETA: 51s - loss: 0.2300 - regression_loss: 0.2201 - classification_loss: 0.0099 399/500 [======================>.......] - ETA: 50s - loss: 0.2301 - regression_loss: 0.2202 - classification_loss: 0.0099 400/500 [=======================>......] - ETA: 50s - loss: 0.2302 - regression_loss: 0.2203 - classification_loss: 0.0099 401/500 [=======================>......] - ETA: 49s - loss: 0.2303 - regression_loss: 0.2204 - classification_loss: 0.0099 402/500 [=======================>......] - ETA: 49s - loss: 0.2300 - regression_loss: 0.2201 - classification_loss: 0.0099 403/500 [=======================>......] - ETA: 48s - loss: 0.2301 - regression_loss: 0.2201 - classification_loss: 0.0099 404/500 [=======================>......] 
- ETA: 48s - loss: 0.2297 - regression_loss: 0.2198 - classification_loss: 0.0099 405/500 [=======================>......] - ETA: 47s - loss: 0.2296 - regression_loss: 0.2197 - classification_loss: 0.0099 406/500 [=======================>......] - ETA: 47s - loss: 0.2292 - regression_loss: 0.2193 - classification_loss: 0.0099 407/500 [=======================>......] - ETA: 46s - loss: 0.2293 - regression_loss: 0.2194 - classification_loss: 0.0099 408/500 [=======================>......] - ETA: 46s - loss: 0.2296 - regression_loss: 0.2197 - classification_loss: 0.0099 409/500 [=======================>......] - ETA: 45s - loss: 0.2298 - regression_loss: 0.2198 - classification_loss: 0.0099 410/500 [=======================>......] - ETA: 45s - loss: 0.2296 - regression_loss: 0.2196 - classification_loss: 0.0099 411/500 [=======================>......] - ETA: 44s - loss: 0.2297 - regression_loss: 0.2198 - classification_loss: 0.0100 412/500 [=======================>......] - ETA: 44s - loss: 0.2297 - regression_loss: 0.2197 - classification_loss: 0.0099 413/500 [=======================>......] - ETA: 43s - loss: 0.2297 - regression_loss: 0.2198 - classification_loss: 0.0099 414/500 [=======================>......] - ETA: 43s - loss: 0.2297 - regression_loss: 0.2198 - classification_loss: 0.0099 415/500 [=======================>......] - ETA: 42s - loss: 0.2300 - regression_loss: 0.2201 - classification_loss: 0.0099 416/500 [=======================>......] - ETA: 42s - loss: 0.2298 - regression_loss: 0.2199 - classification_loss: 0.0099 417/500 [========================>.....] - ETA: 41s - loss: 0.2300 - regression_loss: 0.2201 - classification_loss: 0.0099 418/500 [========================>.....] - ETA: 41s - loss: 0.2300 - regression_loss: 0.2201 - classification_loss: 0.0099 419/500 [========================>.....] - ETA: 40s - loss: 0.2296 - regression_loss: 0.2197 - classification_loss: 0.0099 420/500 [========================>.....] 
- ETA: 40s - loss: 0.2294 - regression_loss: 0.2195 - classification_loss: 0.0099 421/500 [========================>.....] - ETA: 39s - loss: 0.2293 - regression_loss: 0.2194 - classification_loss: 0.0099 422/500 [========================>.....] - ETA: 39s - loss: 0.2293 - regression_loss: 0.2194 - classification_loss: 0.0099 423/500 [========================>.....] - ETA: 38s - loss: 0.2292 - regression_loss: 0.2193 - classification_loss: 0.0099 424/500 [========================>.....] - ETA: 38s - loss: 0.2290 - regression_loss: 0.2191 - classification_loss: 0.0099 425/500 [========================>.....] - ETA: 37s - loss: 0.2291 - regression_loss: 0.2192 - classification_loss: 0.0099 426/500 [========================>.....] - ETA: 37s - loss: 0.2293 - regression_loss: 0.2194 - classification_loss: 0.0099 427/500 [========================>.....] - ETA: 36s - loss: 0.2293 - regression_loss: 0.2194 - classification_loss: 0.0099 428/500 [========================>.....] - ETA: 36s - loss: 0.2292 - regression_loss: 0.2193 - classification_loss: 0.0099 429/500 [========================>.....] - ETA: 35s - loss: 0.2291 - regression_loss: 0.2192 - classification_loss: 0.0099 430/500 [========================>.....] - ETA: 35s - loss: 0.2290 - regression_loss: 0.2191 - classification_loss: 0.0099 431/500 [========================>.....] - ETA: 34s - loss: 0.2288 - regression_loss: 0.2189 - classification_loss: 0.0099 432/500 [========================>.....] - ETA: 34s - loss: 0.2289 - regression_loss: 0.2190 - classification_loss: 0.0099 433/500 [========================>.....] - ETA: 33s - loss: 0.2291 - regression_loss: 0.2192 - classification_loss: 0.0099 434/500 [=========================>....] - ETA: 33s - loss: 0.2289 - regression_loss: 0.2190 - classification_loss: 0.0099 435/500 [=========================>....] - ETA: 32s - loss: 0.2285 - regression_loss: 0.2187 - classification_loss: 0.0099 436/500 [=========================>....] 
- ETA: 32s - loss: 0.2284 - regression_loss: 0.2185 - classification_loss: 0.0099 437/500 [=========================>....] - ETA: 31s - loss: 0.2281 - regression_loss: 0.2182 - classification_loss: 0.0099 438/500 [=========================>....] - ETA: 31s - loss: 0.2282 - regression_loss: 0.2183 - classification_loss: 0.0098 439/500 [=========================>....] - ETA: 30s - loss: 0.2281 - regression_loss: 0.2183 - classification_loss: 0.0098 440/500 [=========================>....] - ETA: 30s - loss: 0.2280 - regression_loss: 0.2182 - classification_loss: 0.0098 441/500 [=========================>....] - ETA: 29s - loss: 0.2280 - regression_loss: 0.2182 - classification_loss: 0.0098 442/500 [=========================>....] - ETA: 29s - loss: 0.2279 - regression_loss: 0.2181 - classification_loss: 0.0098 443/500 [=========================>....] - ETA: 28s - loss: 0.2278 - regression_loss: 0.2180 - classification_loss: 0.0098 444/500 [=========================>....] - ETA: 28s - loss: 0.2277 - regression_loss: 0.2179 - classification_loss: 0.0098 445/500 [=========================>....] - ETA: 27s - loss: 0.2275 - regression_loss: 0.2177 - classification_loss: 0.0098 446/500 [=========================>....] - ETA: 27s - loss: 0.2274 - regression_loss: 0.2176 - classification_loss: 0.0098 447/500 [=========================>....] - ETA: 26s - loss: 0.2271 - regression_loss: 0.2174 - classification_loss: 0.0098 448/500 [=========================>....] - ETA: 26s - loss: 0.2270 - regression_loss: 0.2172 - classification_loss: 0.0097 449/500 [=========================>....] - ETA: 25s - loss: 0.2269 - regression_loss: 0.2171 - classification_loss: 0.0097 450/500 [==========================>...] - ETA: 25s - loss: 0.2270 - regression_loss: 0.2172 - classification_loss: 0.0098 451/500 [==========================>...] - ETA: 24s - loss: 0.2271 - regression_loss: 0.2173 - classification_loss: 0.0098 452/500 [==========================>...] 
- ETA: 24s - loss: 0.2269 - regression_loss: 0.2171 - classification_loss: 0.0098 453/500 [==========================>...] - ETA: 23s - loss: 0.2271 - regression_loss: 0.2173 - classification_loss: 0.0098 454/500 [==========================>...] - ETA: 23s - loss: 0.2270 - regression_loss: 0.2172 - classification_loss: 0.0098 455/500 [==========================>...] - ETA: 22s - loss: 0.2271 - regression_loss: 0.2173 - classification_loss: 0.0098 456/500 [==========================>...] - ETA: 22s - loss: 0.2270 - regression_loss: 0.2173 - classification_loss: 0.0097 457/500 [==========================>...] - ETA: 21s - loss: 0.2272 - regression_loss: 0.2174 - classification_loss: 0.0097 458/500 [==========================>...] - ETA: 21s - loss: 0.2269 - regression_loss: 0.2172 - classification_loss: 0.0097 459/500 [==========================>...] - ETA: 20s - loss: 0.2268 - regression_loss: 0.2171 - classification_loss: 0.0097 460/500 [==========================>...] - ETA: 20s - loss: 0.2268 - regression_loss: 0.2171 - classification_loss: 0.0097 461/500 [==========================>...] - ETA: 19s - loss: 0.2267 - regression_loss: 0.2169 - classification_loss: 0.0097 462/500 [==========================>...] - ETA: 19s - loss: 0.2266 - regression_loss: 0.2169 - classification_loss: 0.0097 463/500 [==========================>...] - ETA: 18s - loss: 0.2266 - regression_loss: 0.2169 - classification_loss: 0.0097 464/500 [==========================>...] - ETA: 18s - loss: 0.2269 - regression_loss: 0.2171 - classification_loss: 0.0097 465/500 [==========================>...] - ETA: 17s - loss: 0.2270 - regression_loss: 0.2173 - classification_loss: 0.0097 466/500 [==========================>...] - ETA: 17s - loss: 0.2274 - regression_loss: 0.2176 - classification_loss: 0.0097 467/500 [===========================>..] - ETA: 16s - loss: 0.2278 - regression_loss: 0.2180 - classification_loss: 0.0098 468/500 [===========================>..] 
- ETA: 16s - loss: 0.2275 - regression_loss: 0.2178 - classification_loss: 0.0098 469/500 [===========================>..] - ETA: 15s - loss: 0.2276 - regression_loss: 0.2178 - classification_loss: 0.0098 470/500 [===========================>..] - ETA: 15s - loss: 0.2275 - regression_loss: 0.2177 - classification_loss: 0.0098 471/500 [===========================>..] - ETA: 14s - loss: 0.2273 - regression_loss: 0.2176 - classification_loss: 0.0098 472/500 [===========================>..] - ETA: 14s - loss: 0.2274 - regression_loss: 0.2176 - classification_loss: 0.0098 473/500 [===========================>..] - ETA: 13s - loss: 0.2277 - regression_loss: 0.2179 - classification_loss: 0.0098 474/500 [===========================>..] - ETA: 13s - loss: 0.2281 - regression_loss: 0.2183 - classification_loss: 0.0098 475/500 [===========================>..] - ETA: 12s - loss: 0.2278 - regression_loss: 0.2180 - classification_loss: 0.0098 476/500 [===========================>..] - ETA: 12s - loss: 0.2278 - regression_loss: 0.2180 - classification_loss: 0.0098 477/500 [===========================>..] - ETA: 11s - loss: 0.2277 - regression_loss: 0.2180 - classification_loss: 0.0098 478/500 [===========================>..] - ETA: 11s - loss: 0.2276 - regression_loss: 0.2178 - classification_loss: 0.0098 479/500 [===========================>..] - ETA: 10s - loss: 0.2275 - regression_loss: 0.2177 - classification_loss: 0.0098 480/500 [===========================>..] - ETA: 10s - loss: 0.2273 - regression_loss: 0.2176 - classification_loss: 0.0098 481/500 [===========================>..] - ETA: 9s - loss: 0.2272 - regression_loss: 0.2175 - classification_loss: 0.0098  482/500 [===========================>..] - ETA: 9s - loss: 0.2271 - regression_loss: 0.2174 - classification_loss: 0.0097 483/500 [===========================>..] - ETA: 8s - loss: 0.2270 - regression_loss: 0.2172 - classification_loss: 0.0097 484/500 [============================>.] 
- ETA: 8s - loss: 0.2266 - regression_loss: 0.2169 - classification_loss: 0.0097 485/500 [============================>.] - ETA: 7s - loss: 0.2265 - regression_loss: 0.2168 - classification_loss: 0.0097 486/500 [============================>.] - ETA: 7s - loss: 0.2271 - regression_loss: 0.2174 - classification_loss: 0.0097 487/500 [============================>.] - ETA: 6s - loss: 0.2273 - regression_loss: 0.2176 - classification_loss: 0.0097 488/500 [============================>.] - ETA: 6s - loss: 0.2272 - regression_loss: 0.2175 - classification_loss: 0.0097 489/500 [============================>.] - ETA: 5s - loss: 0.2272 - regression_loss: 0.2175 - classification_loss: 0.0097 490/500 [============================>.] - ETA: 5s - loss: 0.2270 - regression_loss: 0.2173 - classification_loss: 0.0097 491/500 [============================>.] - ETA: 4s - loss: 0.2271 - regression_loss: 0.2174 - classification_loss: 0.0097 492/500 [============================>.] - ETA: 4s - loss: 0.2272 - regression_loss: 0.2175 - classification_loss: 0.0097 493/500 [============================>.] - ETA: 3s - loss: 0.2276 - regression_loss: 0.2178 - classification_loss: 0.0098 494/500 [============================>.] - ETA: 3s - loss: 0.2275 - regression_loss: 0.2177 - classification_loss: 0.0098 495/500 [============================>.] - ETA: 2s - loss: 0.2278 - regression_loss: 0.2181 - classification_loss: 0.0098 496/500 [============================>.] - ETA: 2s - loss: 0.2279 - regression_loss: 0.2182 - classification_loss: 0.0098 497/500 [============================>.] - ETA: 1s - loss: 0.2282 - regression_loss: 0.2184 - classification_loss: 0.0098 498/500 [============================>.] - ETA: 1s - loss: 0.2284 - regression_loss: 0.2186 - classification_loss: 0.0098 499/500 [============================>.] 
500/500 [==============================] - 251s 503ms/step - loss: 0.2282 - regression_loss: 0.2185 - classification_loss: 0.0097
1172 instances of class plum with average precision: 0.7379
mAP: 0.7379
Epoch 00033: saving model to ./training/snapshots/resnet101_pascal_33.h5
Epoch 34/150
[epoch 34, steps 1-13: per-batch progress output omitted; total loss ~0.17-0.31 over the first batches]
[epoch 34, steps 14-62: per-batch progress output omitted; total loss settled near 0.27-0.30, classification_loss ~0.011-0.013; excerpt ends mid-epoch at step 62/500]
- ETA: 3:40 - loss: 0.2716 - regression_loss: 0.2603 - classification_loss: 0.0113 63/500 [==>...........................] - ETA: 3:39 - loss: 0.2717 - regression_loss: 0.2604 - classification_loss: 0.0113 64/500 [==>...........................] - ETA: 3:39 - loss: 0.2706 - regression_loss: 0.2594 - classification_loss: 0.0112 65/500 [==>...........................] - ETA: 3:38 - loss: 0.2700 - regression_loss: 0.2589 - classification_loss: 0.0111 66/500 [==>...........................] - ETA: 3:38 - loss: 0.2689 - regression_loss: 0.2579 - classification_loss: 0.0110 67/500 [===>..........................] - ETA: 3:37 - loss: 0.2679 - regression_loss: 0.2569 - classification_loss: 0.0110 68/500 [===>..........................] - ETA: 3:37 - loss: 0.2685 - regression_loss: 0.2574 - classification_loss: 0.0112 69/500 [===>..........................] - ETA: 3:36 - loss: 0.2684 - regression_loss: 0.2574 - classification_loss: 0.0111 70/500 [===>..........................] - ETA: 3:35 - loss: 0.2709 - regression_loss: 0.2597 - classification_loss: 0.0112 71/500 [===>..........................] - ETA: 3:35 - loss: 0.2726 - regression_loss: 0.2612 - classification_loss: 0.0114 72/500 [===>..........................] - ETA: 3:34 - loss: 0.2736 - regression_loss: 0.2622 - classification_loss: 0.0114 73/500 [===>..........................] - ETA: 3:34 - loss: 0.2739 - regression_loss: 0.2626 - classification_loss: 0.0113 74/500 [===>..........................] - ETA: 3:33 - loss: 0.2752 - regression_loss: 0.2638 - classification_loss: 0.0113 75/500 [===>..........................] - ETA: 3:33 - loss: 0.2767 - regression_loss: 0.2652 - classification_loss: 0.0115 76/500 [===>..........................] - ETA: 3:32 - loss: 0.2750 - regression_loss: 0.2636 - classification_loss: 0.0114 77/500 [===>..........................] - ETA: 3:32 - loss: 0.2736 - regression_loss: 0.2623 - classification_loss: 0.0113 78/500 [===>..........................] 
- ETA: 3:31 - loss: 0.2734 - regression_loss: 0.2621 - classification_loss: 0.0113 79/500 [===>..........................] - ETA: 3:31 - loss: 0.2744 - regression_loss: 0.2630 - classification_loss: 0.0114 80/500 [===>..........................] - ETA: 3:30 - loss: 0.2739 - regression_loss: 0.2625 - classification_loss: 0.0114 81/500 [===>..........................] - ETA: 3:30 - loss: 0.2732 - regression_loss: 0.2619 - classification_loss: 0.0113 82/500 [===>..........................] - ETA: 3:29 - loss: 0.2716 - regression_loss: 0.2604 - classification_loss: 0.0112 83/500 [===>..........................] - ETA: 3:29 - loss: 0.2717 - regression_loss: 0.2604 - classification_loss: 0.0113 84/500 [====>.........................] - ETA: 3:28 - loss: 0.2703 - regression_loss: 0.2591 - classification_loss: 0.0112 85/500 [====>.........................] - ETA: 3:28 - loss: 0.2686 - regression_loss: 0.2574 - classification_loss: 0.0111 86/500 [====>.........................] - ETA: 3:27 - loss: 0.2678 - regression_loss: 0.2567 - classification_loss: 0.0111 87/500 [====>.........................] - ETA: 3:27 - loss: 0.2674 - regression_loss: 0.2563 - classification_loss: 0.0111 88/500 [====>.........................] - ETA: 3:26 - loss: 0.2660 - regression_loss: 0.2550 - classification_loss: 0.0110 89/500 [====>.........................] - ETA: 3:26 - loss: 0.2663 - regression_loss: 0.2553 - classification_loss: 0.0110 90/500 [====>.........................] - ETA: 3:25 - loss: 0.2658 - regression_loss: 0.2547 - classification_loss: 0.0111 91/500 [====>.........................] - ETA: 3:25 - loss: 0.2638 - regression_loss: 0.2528 - classification_loss: 0.0110 92/500 [====>.........................] - ETA: 3:24 - loss: 0.2635 - regression_loss: 0.2525 - classification_loss: 0.0110 93/500 [====>.........................] - ETA: 3:24 - loss: 0.2643 - regression_loss: 0.2533 - classification_loss: 0.0110 94/500 [====>.........................] 
- ETA: 3:23 - loss: 0.2638 - regression_loss: 0.2527 - classification_loss: 0.0111 95/500 [====>.........................] - ETA: 3:23 - loss: 0.2625 - regression_loss: 0.2515 - classification_loss: 0.0110 96/500 [====>.........................] - ETA: 3:22 - loss: 0.2623 - regression_loss: 0.2512 - classification_loss: 0.0111 97/500 [====>.........................] - ETA: 3:22 - loss: 0.2625 - regression_loss: 0.2514 - classification_loss: 0.0111 98/500 [====>.........................] - ETA: 3:21 - loss: 0.2620 - regression_loss: 0.2510 - classification_loss: 0.0111 99/500 [====>.........................] - ETA: 3:21 - loss: 0.2611 - regression_loss: 0.2501 - classification_loss: 0.0110 100/500 [=====>........................] - ETA: 3:20 - loss: 0.2596 - regression_loss: 0.2486 - classification_loss: 0.0110 101/500 [=====>........................] - ETA: 3:20 - loss: 0.2588 - regression_loss: 0.2479 - classification_loss: 0.0109 102/500 [=====>........................] - ETA: 3:19 - loss: 0.2571 - regression_loss: 0.2463 - classification_loss: 0.0108 103/500 [=====>........................] - ETA: 3:19 - loss: 0.2573 - regression_loss: 0.2463 - classification_loss: 0.0110 104/500 [=====>........................] - ETA: 3:18 - loss: 0.2576 - regression_loss: 0.2467 - classification_loss: 0.0110 105/500 [=====>........................] - ETA: 3:18 - loss: 0.2567 - regression_loss: 0.2458 - classification_loss: 0.0109 106/500 [=====>........................] - ETA: 3:17 - loss: 0.2562 - regression_loss: 0.2453 - classification_loss: 0.0109 107/500 [=====>........................] - ETA: 3:17 - loss: 0.2555 - regression_loss: 0.2447 - classification_loss: 0.0108 108/500 [=====>........................] - ETA: 3:16 - loss: 0.2541 - regression_loss: 0.2433 - classification_loss: 0.0108 109/500 [=====>........................] - ETA: 3:16 - loss: 0.2555 - regression_loss: 0.2446 - classification_loss: 0.0109 110/500 [=====>........................] 
- ETA: 3:15 - loss: 0.2564 - regression_loss: 0.2455 - classification_loss: 0.0109 111/500 [=====>........................] - ETA: 3:15 - loss: 0.2552 - regression_loss: 0.2443 - classification_loss: 0.0109 112/500 [=====>........................] - ETA: 3:14 - loss: 0.2548 - regression_loss: 0.2439 - classification_loss: 0.0109 113/500 [=====>........................] - ETA: 3:14 - loss: 0.2536 - regression_loss: 0.2427 - classification_loss: 0.0108 114/500 [=====>........................] - ETA: 3:13 - loss: 0.2545 - regression_loss: 0.2433 - classification_loss: 0.0111 115/500 [=====>........................] - ETA: 3:13 - loss: 0.2537 - regression_loss: 0.2426 - classification_loss: 0.0111 116/500 [=====>........................] - ETA: 3:12 - loss: 0.2536 - regression_loss: 0.2425 - classification_loss: 0.0111 117/500 [======>.......................] - ETA: 3:12 - loss: 0.2522 - regression_loss: 0.2411 - classification_loss: 0.0110 118/500 [======>.......................] - ETA: 3:11 - loss: 0.2519 - regression_loss: 0.2408 - classification_loss: 0.0111 119/500 [======>.......................] - ETA: 3:11 - loss: 0.2518 - regression_loss: 0.2407 - classification_loss: 0.0111 120/500 [======>.......................] - ETA: 3:10 - loss: 0.2508 - regression_loss: 0.2398 - classification_loss: 0.0110 121/500 [======>.......................] - ETA: 3:10 - loss: 0.2497 - regression_loss: 0.2388 - classification_loss: 0.0109 122/500 [======>.......................] - ETA: 3:09 - loss: 0.2495 - regression_loss: 0.2386 - classification_loss: 0.0109 123/500 [======>.......................] - ETA: 3:09 - loss: 0.2481 - regression_loss: 0.2373 - classification_loss: 0.0108 124/500 [======>.......................] - ETA: 3:08 - loss: 0.2474 - regression_loss: 0.2366 - classification_loss: 0.0108 125/500 [======>.......................] - ETA: 3:08 - loss: 0.2466 - regression_loss: 0.2359 - classification_loss: 0.0107 126/500 [======>.......................] 
- ETA: 3:07 - loss: 0.2458 - regression_loss: 0.2351 - classification_loss: 0.0107 127/500 [======>.......................] - ETA: 3:07 - loss: 0.2453 - regression_loss: 0.2347 - classification_loss: 0.0106 128/500 [======>.......................] - ETA: 3:06 - loss: 0.2447 - regression_loss: 0.2341 - classification_loss: 0.0106 129/500 [======>.......................] - ETA: 3:06 - loss: 0.2442 - regression_loss: 0.2337 - classification_loss: 0.0105 130/500 [======>.......................] - ETA: 3:05 - loss: 0.2447 - regression_loss: 0.2340 - classification_loss: 0.0106 131/500 [======>.......................] - ETA: 3:05 - loss: 0.2442 - regression_loss: 0.2336 - classification_loss: 0.0106 132/500 [======>.......................] - ETA: 3:04 - loss: 0.2430 - regression_loss: 0.2324 - classification_loss: 0.0105 133/500 [======>.......................] - ETA: 3:04 - loss: 0.2428 - regression_loss: 0.2322 - classification_loss: 0.0105 134/500 [=======>......................] - ETA: 3:03 - loss: 0.2420 - regression_loss: 0.2315 - classification_loss: 0.0105 135/500 [=======>......................] - ETA: 3:03 - loss: 0.2414 - regression_loss: 0.2309 - classification_loss: 0.0105 136/500 [=======>......................] - ETA: 3:02 - loss: 0.2405 - regression_loss: 0.2301 - classification_loss: 0.0104 137/500 [=======>......................] - ETA: 3:02 - loss: 0.2404 - regression_loss: 0.2300 - classification_loss: 0.0104 138/500 [=======>......................] - ETA: 3:01 - loss: 0.2390 - regression_loss: 0.2287 - classification_loss: 0.0103 139/500 [=======>......................] - ETA: 3:01 - loss: 0.2374 - regression_loss: 0.2271 - classification_loss: 0.0103 140/500 [=======>......................] - ETA: 3:00 - loss: 0.2372 - regression_loss: 0.2270 - classification_loss: 0.0103 141/500 [=======>......................] - ETA: 3:00 - loss: 0.2359 - regression_loss: 0.2257 - classification_loss: 0.0102 142/500 [=======>......................] 
- ETA: 2:59 - loss: 0.2357 - regression_loss: 0.2255 - classification_loss: 0.0101 143/500 [=======>......................] - ETA: 2:59 - loss: 0.2343 - regression_loss: 0.2243 - classification_loss: 0.0101 144/500 [=======>......................] - ETA: 2:58 - loss: 0.2352 - regression_loss: 0.2252 - classification_loss: 0.0101 145/500 [=======>......................] - ETA: 2:58 - loss: 0.2352 - regression_loss: 0.2251 - classification_loss: 0.0100 146/500 [=======>......................] - ETA: 2:57 - loss: 0.2351 - regression_loss: 0.2250 - classification_loss: 0.0100 147/500 [=======>......................] - ETA: 2:57 - loss: 0.2352 - regression_loss: 0.2252 - classification_loss: 0.0100 148/500 [=======>......................] - ETA: 2:56 - loss: 0.2347 - regression_loss: 0.2247 - classification_loss: 0.0100 149/500 [=======>......................] - ETA: 2:56 - loss: 0.2341 - regression_loss: 0.2242 - classification_loss: 0.0099 150/500 [========>.....................] - ETA: 2:55 - loss: 0.2351 - regression_loss: 0.2250 - classification_loss: 0.0101 151/500 [========>.....................] - ETA: 2:55 - loss: 0.2348 - regression_loss: 0.2248 - classification_loss: 0.0101 152/500 [========>.....................] - ETA: 2:54 - loss: 0.2359 - regression_loss: 0.2258 - classification_loss: 0.0101 153/500 [========>.....................] - ETA: 2:54 - loss: 0.2364 - regression_loss: 0.2263 - classification_loss: 0.0102 154/500 [========>.....................] - ETA: 2:53 - loss: 0.2360 - regression_loss: 0.2258 - classification_loss: 0.0102 155/500 [========>.....................] - ETA: 2:53 - loss: 0.2347 - regression_loss: 0.2246 - classification_loss: 0.0101 156/500 [========>.....................] - ETA: 2:52 - loss: 0.2350 - regression_loss: 0.2248 - classification_loss: 0.0101 157/500 [========>.....................] - ETA: 2:52 - loss: 0.2352 - regression_loss: 0.2251 - classification_loss: 0.0101 158/500 [========>.....................] 
- ETA: 2:51 - loss: 0.2375 - regression_loss: 0.2273 - classification_loss: 0.0102 159/500 [========>.....................] - ETA: 2:51 - loss: 0.2378 - regression_loss: 0.2277 - classification_loss: 0.0101 160/500 [========>.....................] - ETA: 2:50 - loss: 0.2377 - regression_loss: 0.2275 - classification_loss: 0.0102 161/500 [========>.....................] - ETA: 2:50 - loss: 0.2379 - regression_loss: 0.2277 - classification_loss: 0.0102 162/500 [========>.....................] - ETA: 2:49 - loss: 0.2385 - regression_loss: 0.2283 - classification_loss: 0.0102 163/500 [========>.....................] - ETA: 2:49 - loss: 0.2393 - regression_loss: 0.2291 - classification_loss: 0.0102 164/500 [========>.....................] - ETA: 2:48 - loss: 0.2389 - regression_loss: 0.2287 - classification_loss: 0.0102 165/500 [========>.....................] - ETA: 2:48 - loss: 0.2383 - regression_loss: 0.2282 - classification_loss: 0.0102 166/500 [========>.....................] - ETA: 2:47 - loss: 0.2387 - regression_loss: 0.2285 - classification_loss: 0.0102 167/500 [=========>....................] - ETA: 2:47 - loss: 0.2391 - regression_loss: 0.2289 - classification_loss: 0.0102 168/500 [=========>....................] - ETA: 2:46 - loss: 0.2386 - regression_loss: 0.2285 - classification_loss: 0.0101 169/500 [=========>....................] - ETA: 2:46 - loss: 0.2380 - regression_loss: 0.2279 - classification_loss: 0.0101 170/500 [=========>....................] - ETA: 2:45 - loss: 0.2377 - regression_loss: 0.2276 - classification_loss: 0.0101 171/500 [=========>....................] - ETA: 2:45 - loss: 0.2381 - regression_loss: 0.2280 - classification_loss: 0.0101 172/500 [=========>....................] - ETA: 2:44 - loss: 0.2379 - regression_loss: 0.2279 - classification_loss: 0.0100 173/500 [=========>....................] - ETA: 2:44 - loss: 0.2374 - regression_loss: 0.2274 - classification_loss: 0.0100 174/500 [=========>....................] 
- ETA: 2:43 - loss: 0.2374 - regression_loss: 0.2273 - classification_loss: 0.0101 175/500 [=========>....................] - ETA: 2:43 - loss: 0.2364 - regression_loss: 0.2264 - classification_loss: 0.0100 176/500 [=========>....................] - ETA: 2:42 - loss: 0.2365 - regression_loss: 0.2265 - classification_loss: 0.0101 177/500 [=========>....................] - ETA: 2:42 - loss: 0.2363 - regression_loss: 0.2262 - classification_loss: 0.0100 178/500 [=========>....................] - ETA: 2:41 - loss: 0.2357 - regression_loss: 0.2257 - classification_loss: 0.0100 179/500 [=========>....................] - ETA: 2:41 - loss: 0.2355 - regression_loss: 0.2255 - classification_loss: 0.0100 180/500 [=========>....................] - ETA: 2:40 - loss: 0.2347 - regression_loss: 0.2248 - classification_loss: 0.0099 181/500 [=========>....................] - ETA: 2:40 - loss: 0.2346 - regression_loss: 0.2247 - classification_loss: 0.0099 182/500 [=========>....................] - ETA: 2:39 - loss: 0.2350 - regression_loss: 0.2251 - classification_loss: 0.0099 183/500 [=========>....................] - ETA: 2:39 - loss: 0.2345 - regression_loss: 0.2246 - classification_loss: 0.0099 184/500 [==========>...................] - ETA: 2:38 - loss: 0.2340 - regression_loss: 0.2241 - classification_loss: 0.0098 185/500 [==========>...................] - ETA: 2:38 - loss: 0.2332 - regression_loss: 0.2234 - classification_loss: 0.0098 186/500 [==========>...................] - ETA: 2:37 - loss: 0.2327 - regression_loss: 0.2229 - classification_loss: 0.0098 187/500 [==========>...................] - ETA: 2:37 - loss: 0.2319 - regression_loss: 0.2222 - classification_loss: 0.0097 188/500 [==========>...................] - ETA: 2:36 - loss: 0.2314 - regression_loss: 0.2217 - classification_loss: 0.0097 189/500 [==========>...................] - ETA: 2:36 - loss: 0.2309 - regression_loss: 0.2212 - classification_loss: 0.0097 190/500 [==========>...................] 
- ETA: 2:35 - loss: 0.2307 - regression_loss: 0.2211 - classification_loss: 0.0097 191/500 [==========>...................] - ETA: 2:35 - loss: 0.2304 - regression_loss: 0.2208 - classification_loss: 0.0096 192/500 [==========>...................] - ETA: 2:34 - loss: 0.2306 - regression_loss: 0.2208 - classification_loss: 0.0097 193/500 [==========>...................] - ETA: 2:34 - loss: 0.2317 - regression_loss: 0.2217 - classification_loss: 0.0101 194/500 [==========>...................] - ETA: 2:33 - loss: 0.2307 - regression_loss: 0.2207 - classification_loss: 0.0100 195/500 [==========>...................] - ETA: 2:33 - loss: 0.2302 - regression_loss: 0.2202 - classification_loss: 0.0100 196/500 [==========>...................] - ETA: 2:32 - loss: 0.2304 - regression_loss: 0.2204 - classification_loss: 0.0100 197/500 [==========>...................] - ETA: 2:32 - loss: 0.2304 - regression_loss: 0.2204 - classification_loss: 0.0100 198/500 [==========>...................] - ETA: 2:31 - loss: 0.2322 - regression_loss: 0.2221 - classification_loss: 0.0101 199/500 [==========>...................] - ETA: 2:31 - loss: 0.2318 - regression_loss: 0.2218 - classification_loss: 0.0101 200/500 [===========>..................] - ETA: 2:30 - loss: 0.2314 - regression_loss: 0.2214 - classification_loss: 0.0100 201/500 [===========>..................] - ETA: 2:30 - loss: 0.2315 - regression_loss: 0.2214 - classification_loss: 0.0101 202/500 [===========>..................] - ETA: 2:29 - loss: 0.2316 - regression_loss: 0.2215 - classification_loss: 0.0101 203/500 [===========>..................] - ETA: 2:29 - loss: 0.2315 - regression_loss: 0.2215 - classification_loss: 0.0101 204/500 [===========>..................] - ETA: 2:28 - loss: 0.2314 - regression_loss: 0.2213 - classification_loss: 0.0101 205/500 [===========>..................] - ETA: 2:28 - loss: 0.2313 - regression_loss: 0.2212 - classification_loss: 0.0101 206/500 [===========>..................] 
- ETA: 2:27 - loss: 0.2312 - regression_loss: 0.2210 - classification_loss: 0.0102 207/500 [===========>..................] - ETA: 2:27 - loss: 0.2310 - regression_loss: 0.2208 - classification_loss: 0.0102 208/500 [===========>..................] - ETA: 2:26 - loss: 0.2307 - regression_loss: 0.2205 - classification_loss: 0.0102 209/500 [===========>..................] - ETA: 2:26 - loss: 0.2301 - regression_loss: 0.2200 - classification_loss: 0.0101 210/500 [===========>..................] - ETA: 2:25 - loss: 0.2300 - regression_loss: 0.2199 - classification_loss: 0.0101 211/500 [===========>..................] - ETA: 2:25 - loss: 0.2305 - regression_loss: 0.2203 - classification_loss: 0.0102 212/500 [===========>..................] - ETA: 2:24 - loss: 0.2304 - regression_loss: 0.2202 - classification_loss: 0.0102 213/500 [===========>..................] - ETA: 2:24 - loss: 0.2310 - regression_loss: 0.2205 - classification_loss: 0.0104 214/500 [===========>..................] - ETA: 2:23 - loss: 0.2305 - regression_loss: 0.2201 - classification_loss: 0.0104 215/500 [===========>..................] - ETA: 2:23 - loss: 0.2316 - regression_loss: 0.2211 - classification_loss: 0.0105 216/500 [===========>..................] - ETA: 2:22 - loss: 0.2318 - regression_loss: 0.2213 - classification_loss: 0.0105 217/500 [============>.................] - ETA: 2:22 - loss: 0.2323 - regression_loss: 0.2218 - classification_loss: 0.0105 218/500 [============>.................] - ETA: 2:21 - loss: 0.2326 - regression_loss: 0.2221 - classification_loss: 0.0105 219/500 [============>.................] - ETA: 2:21 - loss: 0.2327 - regression_loss: 0.2222 - classification_loss: 0.0105 220/500 [============>.................] - ETA: 2:20 - loss: 0.2325 - regression_loss: 0.2220 - classification_loss: 0.0105 221/500 [============>.................] - ETA: 2:20 - loss: 0.2323 - regression_loss: 0.2218 - classification_loss: 0.0105 222/500 [============>.................] 
- ETA: 2:19 - loss: 0.2323 - regression_loss: 0.2219 - classification_loss: 0.0105 223/500 [============>.................] - ETA: 2:19 - loss: 0.2329 - regression_loss: 0.2225 - classification_loss: 0.0105 224/500 [============>.................] - ETA: 2:18 - loss: 0.2346 - regression_loss: 0.2240 - classification_loss: 0.0105 225/500 [============>.................] - ETA: 2:18 - loss: 0.2344 - regression_loss: 0.2239 - classification_loss: 0.0105 226/500 [============>.................] - ETA: 2:17 - loss: 0.2349 - regression_loss: 0.2244 - classification_loss: 0.0105 227/500 [============>.................] - ETA: 2:17 - loss: 0.2353 - regression_loss: 0.2247 - classification_loss: 0.0106 228/500 [============>.................] - ETA: 2:16 - loss: 0.2349 - regression_loss: 0.2244 - classification_loss: 0.0105 229/500 [============>.................] - ETA: 2:16 - loss: 0.2348 - regression_loss: 0.2242 - classification_loss: 0.0105 230/500 [============>.................] - ETA: 2:15 - loss: 0.2356 - regression_loss: 0.2250 - classification_loss: 0.0106 231/500 [============>.................] - ETA: 2:15 - loss: 0.2353 - regression_loss: 0.2248 - classification_loss: 0.0105 232/500 [============>.................] - ETA: 2:14 - loss: 0.2357 - regression_loss: 0.2251 - classification_loss: 0.0106 233/500 [============>.................] - ETA: 2:14 - loss: 0.2357 - regression_loss: 0.2251 - classification_loss: 0.0106 234/500 [=============>................] - ETA: 2:13 - loss: 0.2356 - regression_loss: 0.2250 - classification_loss: 0.0106 235/500 [=============>................] - ETA: 2:13 - loss: 0.2352 - regression_loss: 0.2247 - classification_loss: 0.0106 236/500 [=============>................] - ETA: 2:12 - loss: 0.2348 - regression_loss: 0.2243 - classification_loss: 0.0106 237/500 [=============>................] - ETA: 2:12 - loss: 0.2344 - regression_loss: 0.2239 - classification_loss: 0.0105 238/500 [=============>................] 
- ETA: 2:11 - loss: 0.2338 - regression_loss: 0.2233 - classification_loss: 0.0105 239/500 [=============>................] - ETA: 2:11 - loss: 0.2338 - regression_loss: 0.2233 - classification_loss: 0.0105 240/500 [=============>................] - ETA: 2:10 - loss: 0.2339 - regression_loss: 0.2234 - classification_loss: 0.0105 241/500 [=============>................] - ETA: 2:10 - loss: 0.2337 - regression_loss: 0.2233 - classification_loss: 0.0104 242/500 [=============>................] - ETA: 2:09 - loss: 0.2338 - regression_loss: 0.2234 - classification_loss: 0.0105 243/500 [=============>................] - ETA: 2:09 - loss: 0.2338 - regression_loss: 0.2234 - classification_loss: 0.0105 244/500 [=============>................] - ETA: 2:08 - loss: 0.2340 - regression_loss: 0.2236 - classification_loss: 0.0104 245/500 [=============>................] - ETA: 2:08 - loss: 0.2340 - regression_loss: 0.2235 - classification_loss: 0.0105 246/500 [=============>................] - ETA: 2:07 - loss: 0.2336 - regression_loss: 0.2231 - classification_loss: 0.0104 247/500 [=============>................] - ETA: 2:07 - loss: 0.2336 - regression_loss: 0.2232 - classification_loss: 0.0104 248/500 [=============>................] - ETA: 2:06 - loss: 0.2336 - regression_loss: 0.2232 - classification_loss: 0.0105 249/500 [=============>................] - ETA: 2:06 - loss: 0.2333 - regression_loss: 0.2229 - classification_loss: 0.0104 250/500 [==============>...............] - ETA: 2:05 - loss: 0.2331 - regression_loss: 0.2227 - classification_loss: 0.0104 251/500 [==============>...............] - ETA: 2:05 - loss: 0.2332 - regression_loss: 0.2228 - classification_loss: 0.0104 252/500 [==============>...............] - ETA: 2:04 - loss: 0.2328 - regression_loss: 0.2224 - classification_loss: 0.0104 253/500 [==============>...............] - ETA: 2:04 - loss: 0.2332 - regression_loss: 0.2227 - classification_loss: 0.0105 254/500 [==============>...............] 
- ETA: 2:03 - loss: 0.2333 - regression_loss: 0.2228 - classification_loss: 0.0105 255/500 [==============>...............] - ETA: 2:03 - loss: 0.2329 - regression_loss: 0.2224 - classification_loss: 0.0105 256/500 [==============>...............] - ETA: 2:02 - loss: 0.2327 - regression_loss: 0.2222 - classification_loss: 0.0105 257/500 [==============>...............] - ETA: 2:02 - loss: 0.2325 - regression_loss: 0.2220 - classification_loss: 0.0105 258/500 [==============>...............] - ETA: 2:01 - loss: 0.2328 - regression_loss: 0.2222 - classification_loss: 0.0105 259/500 [==============>...............] - ETA: 2:01 - loss: 0.2326 - regression_loss: 0.2221 - classification_loss: 0.0105 260/500 [==============>...............] - ETA: 2:00 - loss: 0.2325 - regression_loss: 0.2219 - classification_loss: 0.0105 261/500 [==============>...............] - ETA: 2:00 - loss: 0.2332 - regression_loss: 0.2224 - classification_loss: 0.0108 262/500 [==============>...............] - ETA: 1:59 - loss: 0.2327 - regression_loss: 0.2220 - classification_loss: 0.0107 263/500 [==============>...............] - ETA: 1:59 - loss: 0.2341 - regression_loss: 0.2232 - classification_loss: 0.0108 264/500 [==============>...............] - ETA: 1:58 - loss: 0.2338 - regression_loss: 0.2230 - classification_loss: 0.0108 265/500 [==============>...............] - ETA: 1:58 - loss: 0.2340 - regression_loss: 0.2232 - classification_loss: 0.0108 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2339 - regression_loss: 0.2231 - classification_loss: 0.0108 267/500 [===============>..............] - ETA: 1:57 - loss: 0.2341 - regression_loss: 0.2233 - classification_loss: 0.0108 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2343 - regression_loss: 0.2234 - classification_loss: 0.0108 269/500 [===============>..............] - ETA: 1:56 - loss: 0.2340 - regression_loss: 0.2231 - classification_loss: 0.0108 270/500 [===============>..............] 
- ETA: 1:55 - loss: 0.2341 - regression_loss: 0.2232 - classification_loss: 0.0108
500/500 [==============================] - 251s 503ms/step - loss: 0.2198 - regression_loss: 0.2100 - classification_loss: 0.0098
1172 instances of class plum with average precision: 0.7394
mAP: 0.7394
Epoch 00034: saving model to ./training/snapshots/resnet101_pascal_34.h5
Epoch 35/150
1/500 [..............................] - ETA: 3:59 - loss: 0.1872 - regression_loss: 0.1820 - classification_loss: 0.0052
9/500 [..............................]
- ETA: 4:04 - loss: 0.2876 - regression_loss: 0.2733 - classification_loss: 0.0143
105/500 [=====>........................]
- ETA: 3:18 - loss: 0.2365 - regression_loss: 0.2263 - classification_loss: 0.0102 106/500 [=====>........................] - ETA: 3:18 - loss: 0.2366 - regression_loss: 0.2264 - classification_loss: 0.0102 107/500 [=====>........................] - ETA: 3:17 - loss: 0.2356 - regression_loss: 0.2255 - classification_loss: 0.0102 108/500 [=====>........................] - ETA: 3:17 - loss: 0.2365 - regression_loss: 0.2263 - classification_loss: 0.0102 109/500 [=====>........................] - ETA: 3:16 - loss: 0.2362 - regression_loss: 0.2259 - classification_loss: 0.0103 110/500 [=====>........................] - ETA: 3:16 - loss: 0.2371 - regression_loss: 0.2266 - classification_loss: 0.0104 111/500 [=====>........................] - ETA: 3:15 - loss: 0.2355 - regression_loss: 0.2252 - classification_loss: 0.0104 112/500 [=====>........................] - ETA: 3:15 - loss: 0.2353 - regression_loss: 0.2250 - classification_loss: 0.0103 113/500 [=====>........................] - ETA: 3:14 - loss: 0.2349 - regression_loss: 0.2246 - classification_loss: 0.0103 114/500 [=====>........................] - ETA: 3:14 - loss: 0.2343 - regression_loss: 0.2240 - classification_loss: 0.0103 115/500 [=====>........................] - ETA: 3:13 - loss: 0.2330 - regression_loss: 0.2227 - classification_loss: 0.0102 116/500 [=====>........................] - ETA: 3:13 - loss: 0.2335 - regression_loss: 0.2232 - classification_loss: 0.0102 117/500 [======>.......................] - ETA: 3:12 - loss: 0.2338 - regression_loss: 0.2235 - classification_loss: 0.0103 118/500 [======>.......................] - ETA: 3:12 - loss: 0.2335 - regression_loss: 0.2233 - classification_loss: 0.0103 119/500 [======>.......................] - ETA: 3:11 - loss: 0.2320 - regression_loss: 0.2218 - classification_loss: 0.0102 120/500 [======>.......................] - ETA: 3:11 - loss: 0.2316 - regression_loss: 0.2214 - classification_loss: 0.0102 121/500 [======>.......................] 
- ETA: 3:10 - loss: 0.2314 - regression_loss: 0.2211 - classification_loss: 0.0102 122/500 [======>.......................] - ETA: 3:10 - loss: 0.2320 - regression_loss: 0.2219 - classification_loss: 0.0102 123/500 [======>.......................] - ETA: 3:09 - loss: 0.2315 - regression_loss: 0.2214 - classification_loss: 0.0101 124/500 [======>.......................] - ETA: 3:09 - loss: 0.2311 - regression_loss: 0.2210 - classification_loss: 0.0101 125/500 [======>.......................] - ETA: 3:08 - loss: 0.2318 - regression_loss: 0.2217 - classification_loss: 0.0101 126/500 [======>.......................] - ETA: 3:08 - loss: 0.2319 - regression_loss: 0.2218 - classification_loss: 0.0101 127/500 [======>.......................] - ETA: 3:07 - loss: 0.2319 - regression_loss: 0.2218 - classification_loss: 0.0101 128/500 [======>.......................] - ETA: 3:06 - loss: 0.2316 - regression_loss: 0.2215 - classification_loss: 0.0101 129/500 [======>.......................] - ETA: 3:06 - loss: 0.2305 - regression_loss: 0.2204 - classification_loss: 0.0100 130/500 [======>.......................] - ETA: 3:05 - loss: 0.2302 - regression_loss: 0.2202 - classification_loss: 0.0100 131/500 [======>.......................] - ETA: 3:05 - loss: 0.2303 - regression_loss: 0.2203 - classification_loss: 0.0100 132/500 [======>.......................] - ETA: 3:04 - loss: 0.2300 - regression_loss: 0.2200 - classification_loss: 0.0100 133/500 [======>.......................] - ETA: 3:04 - loss: 0.2297 - regression_loss: 0.2197 - classification_loss: 0.0099 134/500 [=======>......................] - ETA: 3:03 - loss: 0.2299 - regression_loss: 0.2200 - classification_loss: 0.0100 135/500 [=======>......................] - ETA: 3:03 - loss: 0.2304 - regression_loss: 0.2204 - classification_loss: 0.0099 136/500 [=======>......................] - ETA: 3:02 - loss: 0.2304 - regression_loss: 0.2205 - classification_loss: 0.0099 137/500 [=======>......................] 
- ETA: 3:02 - loss: 0.2299 - regression_loss: 0.2200 - classification_loss: 0.0098 138/500 [=======>......................] - ETA: 3:01 - loss: 0.2289 - regression_loss: 0.2191 - classification_loss: 0.0098 139/500 [=======>......................] - ETA: 3:01 - loss: 0.2290 - regression_loss: 0.2192 - classification_loss: 0.0098 140/500 [=======>......................] - ETA: 3:00 - loss: 0.2295 - regression_loss: 0.2198 - classification_loss: 0.0098 141/500 [=======>......................] - ETA: 3:00 - loss: 0.2297 - regression_loss: 0.2198 - classification_loss: 0.0099 142/500 [=======>......................] - ETA: 2:59 - loss: 0.2298 - regression_loss: 0.2199 - classification_loss: 0.0099 143/500 [=======>......................] - ETA: 2:59 - loss: 0.2302 - regression_loss: 0.2202 - classification_loss: 0.0100 144/500 [=======>......................] - ETA: 2:58 - loss: 0.2297 - regression_loss: 0.2198 - classification_loss: 0.0099 145/500 [=======>......................] - ETA: 2:58 - loss: 0.2303 - regression_loss: 0.2203 - classification_loss: 0.0099 146/500 [=======>......................] - ETA: 2:57 - loss: 0.2298 - regression_loss: 0.2199 - classification_loss: 0.0099 147/500 [=======>......................] - ETA: 2:57 - loss: 0.2323 - regression_loss: 0.2225 - classification_loss: 0.0098 148/500 [=======>......................] - ETA: 2:57 - loss: 0.2323 - regression_loss: 0.2225 - classification_loss: 0.0098 149/500 [=======>......................] - ETA: 2:56 - loss: 0.2320 - regression_loss: 0.2222 - classification_loss: 0.0098 150/500 [========>.....................] - ETA: 2:55 - loss: 0.2314 - regression_loss: 0.2216 - classification_loss: 0.0098 151/500 [========>.....................] - ETA: 2:55 - loss: 0.2318 - regression_loss: 0.2220 - classification_loss: 0.0098 152/500 [========>.....................] - ETA: 2:54 - loss: 0.2317 - regression_loss: 0.2219 - classification_loss: 0.0098 153/500 [========>.....................] 
- ETA: 2:54 - loss: 0.2315 - regression_loss: 0.2217 - classification_loss: 0.0097 154/500 [========>.....................] - ETA: 2:53 - loss: 0.2319 - regression_loss: 0.2221 - classification_loss: 0.0097 155/500 [========>.....................] - ETA: 2:53 - loss: 0.2316 - regression_loss: 0.2219 - classification_loss: 0.0097 156/500 [========>.....................] - ETA: 2:52 - loss: 0.2311 - regression_loss: 0.2214 - classification_loss: 0.0097 157/500 [========>.....................] - ETA: 2:52 - loss: 0.2307 - regression_loss: 0.2209 - classification_loss: 0.0097 158/500 [========>.....................] - ETA: 2:51 - loss: 0.2314 - regression_loss: 0.2217 - classification_loss: 0.0097 159/500 [========>.....................] - ETA: 2:51 - loss: 0.2311 - regression_loss: 0.2214 - classification_loss: 0.0097 160/500 [========>.....................] - ETA: 2:51 - loss: 0.2303 - regression_loss: 0.2207 - classification_loss: 0.0096 161/500 [========>.....................] - ETA: 2:50 - loss: 0.2307 - regression_loss: 0.2210 - classification_loss: 0.0097 162/500 [========>.....................] - ETA: 2:49 - loss: 0.2303 - regression_loss: 0.2206 - classification_loss: 0.0096 163/500 [========>.....................] - ETA: 2:49 - loss: 0.2301 - regression_loss: 0.2205 - classification_loss: 0.0096 164/500 [========>.....................] - ETA: 2:48 - loss: 0.2299 - regression_loss: 0.2203 - classification_loss: 0.0096 165/500 [========>.....................] - ETA: 2:48 - loss: 0.2288 - regression_loss: 0.2193 - classification_loss: 0.0095 166/500 [========>.....................] - ETA: 2:47 - loss: 0.2281 - regression_loss: 0.2186 - classification_loss: 0.0095 167/500 [=========>....................] - ETA: 2:47 - loss: 0.2274 - regression_loss: 0.2179 - classification_loss: 0.0095 168/500 [=========>....................] - ETA: 2:46 - loss: 0.2264 - regression_loss: 0.2170 - classification_loss: 0.0094 169/500 [=========>....................] 
- ETA: 2:46 - loss: 0.2259 - regression_loss: 0.2165 - classification_loss: 0.0094 170/500 [=========>....................] - ETA: 2:45 - loss: 0.2255 - regression_loss: 0.2161 - classification_loss: 0.0094 171/500 [=========>....................] - ETA: 2:45 - loss: 0.2269 - regression_loss: 0.2175 - classification_loss: 0.0094 172/500 [=========>....................] - ETA: 2:44 - loss: 0.2264 - regression_loss: 0.2170 - classification_loss: 0.0094 173/500 [=========>....................] - ETA: 2:44 - loss: 0.2269 - regression_loss: 0.2175 - classification_loss: 0.0094 174/500 [=========>....................] - ETA: 2:43 - loss: 0.2274 - regression_loss: 0.2180 - classification_loss: 0.0094 175/500 [=========>....................] - ETA: 2:43 - loss: 0.2275 - regression_loss: 0.2181 - classification_loss: 0.0094 176/500 [=========>....................] - ETA: 2:42 - loss: 0.2275 - regression_loss: 0.2181 - classification_loss: 0.0094 177/500 [=========>....................] - ETA: 2:42 - loss: 0.2267 - regression_loss: 0.2173 - classification_loss: 0.0093 178/500 [=========>....................] - ETA: 2:41 - loss: 0.2262 - regression_loss: 0.2169 - classification_loss: 0.0093 179/500 [=========>....................] - ETA: 2:41 - loss: 0.2261 - regression_loss: 0.2168 - classification_loss: 0.0093 180/500 [=========>....................] - ETA: 2:40 - loss: 0.2255 - regression_loss: 0.2162 - classification_loss: 0.0093 181/500 [=========>....................] - ETA: 2:40 - loss: 0.2248 - regression_loss: 0.2156 - classification_loss: 0.0092 182/500 [=========>....................] - ETA: 2:39 - loss: 0.2252 - regression_loss: 0.2159 - classification_loss: 0.0093 183/500 [=========>....................] - ETA: 2:39 - loss: 0.2245 - regression_loss: 0.2153 - classification_loss: 0.0092 184/500 [==========>...................] - ETA: 2:38 - loss: 0.2248 - regression_loss: 0.2156 - classification_loss: 0.0092 185/500 [==========>...................] 
- ETA: 2:38 - loss: 0.2246 - regression_loss: 0.2153 - classification_loss: 0.0092 186/500 [==========>...................] - ETA: 2:37 - loss: 0.2246 - regression_loss: 0.2154 - classification_loss: 0.0093 187/500 [==========>...................] - ETA: 2:37 - loss: 0.2254 - regression_loss: 0.2161 - classification_loss: 0.0093 188/500 [==========>...................] - ETA: 2:36 - loss: 0.2248 - regression_loss: 0.2156 - classification_loss: 0.0093 189/500 [==========>...................] - ETA: 2:36 - loss: 0.2247 - regression_loss: 0.2154 - classification_loss: 0.0093 190/500 [==========>...................] - ETA: 2:35 - loss: 0.2244 - regression_loss: 0.2151 - classification_loss: 0.0093 191/500 [==========>...................] - ETA: 2:35 - loss: 0.2239 - regression_loss: 0.2147 - classification_loss: 0.0092 192/500 [==========>...................] - ETA: 2:34 - loss: 0.2232 - regression_loss: 0.2140 - classification_loss: 0.0092 193/500 [==========>...................] - ETA: 2:34 - loss: 0.2229 - regression_loss: 0.2137 - classification_loss: 0.0092 194/500 [==========>...................] - ETA: 2:33 - loss: 0.2231 - regression_loss: 0.2139 - classification_loss: 0.0092 195/500 [==========>...................] - ETA: 2:33 - loss: 0.2229 - regression_loss: 0.2138 - classification_loss: 0.0092 196/500 [==========>...................] - ETA: 2:32 - loss: 0.2225 - regression_loss: 0.2134 - classification_loss: 0.0091 197/500 [==========>...................] - ETA: 2:32 - loss: 0.2218 - regression_loss: 0.2127 - classification_loss: 0.0091 198/500 [==========>...................] - ETA: 2:31 - loss: 0.2215 - regression_loss: 0.2124 - classification_loss: 0.0091 199/500 [==========>...................] - ETA: 2:31 - loss: 0.2207 - regression_loss: 0.2117 - classification_loss: 0.0090 200/500 [===========>..................] - ETA: 2:30 - loss: 0.2204 - regression_loss: 0.2114 - classification_loss: 0.0090 201/500 [===========>..................] 
- ETA: 2:30 - loss: 0.2197 - regression_loss: 0.2107 - classification_loss: 0.0090 202/500 [===========>..................] - ETA: 2:29 - loss: 0.2204 - regression_loss: 0.2114 - classification_loss: 0.0090 203/500 [===========>..................] - ETA: 2:29 - loss: 0.2211 - regression_loss: 0.2120 - classification_loss: 0.0091 204/500 [===========>..................] - ETA: 2:28 - loss: 0.2215 - regression_loss: 0.2124 - classification_loss: 0.0091 205/500 [===========>..................] - ETA: 2:28 - loss: 0.2208 - regression_loss: 0.2117 - classification_loss: 0.0091 206/500 [===========>..................] - ETA: 2:27 - loss: 0.2207 - regression_loss: 0.2117 - classification_loss: 0.0091 207/500 [===========>..................] - ETA: 2:27 - loss: 0.2205 - regression_loss: 0.2114 - classification_loss: 0.0091 208/500 [===========>..................] - ETA: 2:26 - loss: 0.2196 - regression_loss: 0.2106 - classification_loss: 0.0090 209/500 [===========>..................] - ETA: 2:26 - loss: 0.2203 - regression_loss: 0.2110 - classification_loss: 0.0093 210/500 [===========>..................] - ETA: 2:25 - loss: 0.2198 - regression_loss: 0.2105 - classification_loss: 0.0093 211/500 [===========>..................] - ETA: 2:25 - loss: 0.2196 - regression_loss: 0.2104 - classification_loss: 0.0093 212/500 [===========>..................] - ETA: 2:24 - loss: 0.2195 - regression_loss: 0.2102 - classification_loss: 0.0093 213/500 [===========>..................] - ETA: 2:24 - loss: 0.2200 - regression_loss: 0.2108 - classification_loss: 0.0092 214/500 [===========>..................] - ETA: 2:23 - loss: 0.2199 - regression_loss: 0.2107 - classification_loss: 0.0092 215/500 [===========>..................] - ETA: 2:23 - loss: 0.2200 - regression_loss: 0.2108 - classification_loss: 0.0092 216/500 [===========>..................] - ETA: 2:22 - loss: 0.2212 - regression_loss: 0.2119 - classification_loss: 0.0093 217/500 [============>.................] 
- ETA: 2:22 - loss: 0.2208 - regression_loss: 0.2115 - classification_loss: 0.0093 218/500 [============>.................] - ETA: 2:21 - loss: 0.2215 - regression_loss: 0.2122 - classification_loss: 0.0094 219/500 [============>.................] - ETA: 2:21 - loss: 0.2219 - regression_loss: 0.2125 - classification_loss: 0.0094 220/500 [============>.................] - ETA: 2:20 - loss: 0.2224 - regression_loss: 0.2130 - classification_loss: 0.0094 221/500 [============>.................] - ETA: 2:20 - loss: 0.2221 - regression_loss: 0.2127 - classification_loss: 0.0094 222/500 [============>.................] - ETA: 2:19 - loss: 0.2220 - regression_loss: 0.2126 - classification_loss: 0.0094 223/500 [============>.................] - ETA: 2:19 - loss: 0.2221 - regression_loss: 0.2127 - classification_loss: 0.0094 224/500 [============>.................] - ETA: 2:18 - loss: 0.2223 - regression_loss: 0.2129 - classification_loss: 0.0094 225/500 [============>.................] - ETA: 2:18 - loss: 0.2225 - regression_loss: 0.2131 - classification_loss: 0.0094 226/500 [============>.................] - ETA: 2:17 - loss: 0.2229 - regression_loss: 0.2135 - classification_loss: 0.0094 227/500 [============>.................] - ETA: 2:17 - loss: 0.2237 - regression_loss: 0.2142 - classification_loss: 0.0095 228/500 [============>.................] - ETA: 2:16 - loss: 0.2242 - regression_loss: 0.2147 - classification_loss: 0.0095 229/500 [============>.................] - ETA: 2:16 - loss: 0.2244 - regression_loss: 0.2149 - classification_loss: 0.0095 230/500 [============>.................] - ETA: 2:15 - loss: 0.2238 - regression_loss: 0.2143 - classification_loss: 0.0094 231/500 [============>.................] - ETA: 2:15 - loss: 0.2240 - regression_loss: 0.2145 - classification_loss: 0.0095 232/500 [============>.................] - ETA: 2:14 - loss: 0.2241 - regression_loss: 0.2146 - classification_loss: 0.0095 233/500 [============>.................] 
- ETA: 2:14 - loss: 0.2242 - regression_loss: 0.2147 - classification_loss: 0.0095 234/500 [=============>................] - ETA: 2:13 - loss: 0.2239 - regression_loss: 0.2144 - classification_loss: 0.0095 235/500 [=============>................] - ETA: 2:13 - loss: 0.2240 - regression_loss: 0.2144 - classification_loss: 0.0095 236/500 [=============>................] - ETA: 2:12 - loss: 0.2234 - regression_loss: 0.2139 - classification_loss: 0.0095 237/500 [=============>................] - ETA: 2:12 - loss: 0.2230 - regression_loss: 0.2136 - classification_loss: 0.0095 238/500 [=============>................] - ETA: 2:11 - loss: 0.2228 - regression_loss: 0.2134 - classification_loss: 0.0094 239/500 [=============>................] - ETA: 2:11 - loss: 0.2226 - regression_loss: 0.2132 - classification_loss: 0.0094 240/500 [=============>................] - ETA: 2:10 - loss: 0.2227 - regression_loss: 0.2133 - classification_loss: 0.0094 241/500 [=============>................] - ETA: 2:10 - loss: 0.2227 - regression_loss: 0.2133 - classification_loss: 0.0094 242/500 [=============>................] - ETA: 2:09 - loss: 0.2236 - regression_loss: 0.2141 - classification_loss: 0.0095 243/500 [=============>................] - ETA: 2:09 - loss: 0.2230 - regression_loss: 0.2135 - classification_loss: 0.0094 244/500 [=============>................] - ETA: 2:08 - loss: 0.2228 - regression_loss: 0.2133 - classification_loss: 0.0094 245/500 [=============>................] - ETA: 2:08 - loss: 0.2228 - regression_loss: 0.2134 - classification_loss: 0.0094 246/500 [=============>................] - ETA: 2:07 - loss: 0.2230 - regression_loss: 0.2136 - classification_loss: 0.0094 247/500 [=============>................] - ETA: 2:07 - loss: 0.2232 - regression_loss: 0.2138 - classification_loss: 0.0094 248/500 [=============>................] - ETA: 2:06 - loss: 0.2233 - regression_loss: 0.2139 - classification_loss: 0.0094 249/500 [=============>................] 
- ETA: 2:06 - loss: 0.2234 - regression_loss: 0.2140 - classification_loss: 0.0094 250/500 [==============>...............] - ETA: 2:05 - loss: 0.2233 - regression_loss: 0.2139 - classification_loss: 0.0094 251/500 [==============>...............] - ETA: 2:05 - loss: 0.2237 - regression_loss: 0.2143 - classification_loss: 0.0094 252/500 [==============>...............] - ETA: 2:04 - loss: 0.2236 - regression_loss: 0.2142 - classification_loss: 0.0094 253/500 [==============>...............] - ETA: 2:04 - loss: 0.2232 - regression_loss: 0.2138 - classification_loss: 0.0094 254/500 [==============>...............] - ETA: 2:03 - loss: 0.2232 - regression_loss: 0.2138 - classification_loss: 0.0094 255/500 [==============>...............] - ETA: 2:03 - loss: 0.2234 - regression_loss: 0.2139 - classification_loss: 0.0095 256/500 [==============>...............] - ETA: 2:02 - loss: 0.2240 - regression_loss: 0.2144 - classification_loss: 0.0095 257/500 [==============>...............] - ETA: 2:02 - loss: 0.2238 - regression_loss: 0.2143 - classification_loss: 0.0095 258/500 [==============>...............] - ETA: 2:01 - loss: 0.2238 - regression_loss: 0.2142 - classification_loss: 0.0095 259/500 [==============>...............] - ETA: 2:01 - loss: 0.2239 - regression_loss: 0.2144 - classification_loss: 0.0095 260/500 [==============>...............] - ETA: 2:00 - loss: 0.2247 - regression_loss: 0.2151 - classification_loss: 0.0095 261/500 [==============>...............] - ETA: 2:00 - loss: 0.2246 - regression_loss: 0.2150 - classification_loss: 0.0095 262/500 [==============>...............] - ETA: 1:59 - loss: 0.2250 - regression_loss: 0.2154 - classification_loss: 0.0095 263/500 [==============>...............] - ETA: 1:59 - loss: 0.2245 - regression_loss: 0.2150 - classification_loss: 0.0095 264/500 [==============>...............] - ETA: 1:58 - loss: 0.2241 - regression_loss: 0.2146 - classification_loss: 0.0095 265/500 [==============>...............] 
- ETA: 1:58 - loss: 0.2238 - regression_loss: 0.2143 - classification_loss: 0.0095 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2236 - regression_loss: 0.2142 - classification_loss: 0.0094 267/500 [===============>..............] - ETA: 1:57 - loss: 0.2236 - regression_loss: 0.2142 - classification_loss: 0.0094 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2236 - regression_loss: 0.2142 - classification_loss: 0.0094 269/500 [===============>..............] - ETA: 1:56 - loss: 0.2249 - regression_loss: 0.2155 - classification_loss: 0.0094 270/500 [===============>..............] - ETA: 1:55 - loss: 0.2252 - regression_loss: 0.2157 - classification_loss: 0.0094 271/500 [===============>..............] - ETA: 1:55 - loss: 0.2248 - regression_loss: 0.2154 - classification_loss: 0.0094 272/500 [===============>..............] - ETA: 1:54 - loss: 0.2247 - regression_loss: 0.2153 - classification_loss: 0.0094 273/500 [===============>..............] - ETA: 1:54 - loss: 0.2249 - regression_loss: 0.2155 - classification_loss: 0.0094 274/500 [===============>..............] - ETA: 1:53 - loss: 0.2248 - regression_loss: 0.2154 - classification_loss: 0.0094 275/500 [===============>..............] - ETA: 1:53 - loss: 0.2246 - regression_loss: 0.2152 - classification_loss: 0.0093 276/500 [===============>..............] - ETA: 1:52 - loss: 0.2254 - regression_loss: 0.2160 - classification_loss: 0.0094 277/500 [===============>..............] - ETA: 1:52 - loss: 0.2258 - regression_loss: 0.2164 - classification_loss: 0.0094 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2260 - regression_loss: 0.2166 - classification_loss: 0.0094 279/500 [===============>..............] - ETA: 1:51 - loss: 0.2266 - regression_loss: 0.2171 - classification_loss: 0.0094 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2261 - regression_loss: 0.2167 - classification_loss: 0.0094 281/500 [===============>..............] 
- ETA: 1:50 - loss: 0.2261 - regression_loss: 0.2167 - classification_loss: 0.0094 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2268 - regression_loss: 0.2173 - classification_loss: 0.0094 283/500 [===============>..............] - ETA: 1:49 - loss: 0.2272 - regression_loss: 0.2178 - classification_loss: 0.0094 284/500 [================>.............] - ETA: 1:48 - loss: 0.2274 - regression_loss: 0.2180 - classification_loss: 0.0094 285/500 [================>.............] - ETA: 1:48 - loss: 0.2277 - regression_loss: 0.2183 - classification_loss: 0.0094 286/500 [================>.............] - ETA: 1:47 - loss: 0.2281 - regression_loss: 0.2186 - classification_loss: 0.0094 287/500 [================>.............] - ETA: 1:47 - loss: 0.2290 - regression_loss: 0.2195 - classification_loss: 0.0095 288/500 [================>.............] - ETA: 1:46 - loss: 0.2289 - regression_loss: 0.2193 - classification_loss: 0.0095 289/500 [================>.............] - ETA: 1:46 - loss: 0.2296 - regression_loss: 0.2198 - classification_loss: 0.0097 290/500 [================>.............] - ETA: 1:45 - loss: 0.2301 - regression_loss: 0.2203 - classification_loss: 0.0098 291/500 [================>.............] - ETA: 1:45 - loss: 0.2298 - regression_loss: 0.2200 - classification_loss: 0.0098 292/500 [================>.............] - ETA: 1:44 - loss: 0.2302 - regression_loss: 0.2204 - classification_loss: 0.0098 293/500 [================>.............] - ETA: 1:44 - loss: 0.2300 - regression_loss: 0.2202 - classification_loss: 0.0098 294/500 [================>.............] - ETA: 1:43 - loss: 0.2303 - regression_loss: 0.2205 - classification_loss: 0.0098 295/500 [================>.............] - ETA: 1:43 - loss: 0.2302 - regression_loss: 0.2204 - classification_loss: 0.0098 296/500 [================>.............] - ETA: 1:42 - loss: 0.2301 - regression_loss: 0.2203 - classification_loss: 0.0098 297/500 [================>.............] 
- ETA: 1:42 - loss: 0.2297 - regression_loss: 0.2199 - classification_loss: 0.0098 298/500 [================>.............] - ETA: 1:41 - loss: 0.2296 - regression_loss: 0.2198 - classification_loss: 0.0098 299/500 [================>.............] - ETA: 1:41 - loss: 0.2294 - regression_loss: 0.2197 - classification_loss: 0.0098 300/500 [=================>............] - ETA: 1:40 - loss: 0.2290 - regression_loss: 0.2192 - classification_loss: 0.0098 301/500 [=================>............] - ETA: 1:40 - loss: 0.2289 - regression_loss: 0.2191 - classification_loss: 0.0098 302/500 [=================>............] - ETA: 1:39 - loss: 0.2287 - regression_loss: 0.2189 - classification_loss: 0.0097 303/500 [=================>............] - ETA: 1:39 - loss: 0.2283 - regression_loss: 0.2186 - classification_loss: 0.0097 304/500 [=================>............] - ETA: 1:38 - loss: 0.2281 - regression_loss: 0.2184 - classification_loss: 0.0097 305/500 [=================>............] - ETA: 1:38 - loss: 0.2284 - regression_loss: 0.2186 - classification_loss: 0.0097 306/500 [=================>............] - ETA: 1:37 - loss: 0.2281 - regression_loss: 0.2183 - classification_loss: 0.0097 307/500 [=================>............] - ETA: 1:37 - loss: 0.2280 - regression_loss: 0.2183 - classification_loss: 0.0097 308/500 [=================>............] - ETA: 1:36 - loss: 0.2279 - regression_loss: 0.2182 - classification_loss: 0.0097 309/500 [=================>............] - ETA: 1:36 - loss: 0.2276 - regression_loss: 0.2179 - classification_loss: 0.0097 310/500 [=================>............] - ETA: 1:35 - loss: 0.2272 - regression_loss: 0.2175 - classification_loss: 0.0097 311/500 [=================>............] - ETA: 1:35 - loss: 0.2270 - regression_loss: 0.2174 - classification_loss: 0.0097 312/500 [=================>............] - ETA: 1:34 - loss: 0.2266 - regression_loss: 0.2170 - classification_loss: 0.0097 313/500 [=================>............] 
500/500 [==============================] - 251s 503ms/step - loss: 0.2181 - regression_loss: 0.2091 - classification_loss: 0.0090
1172 instances of class plum with average precision: 0.7458
mAP: 0.7458
Epoch 00035: saving model to ./training/snapshots/resnet101_pascal_35.h5
Epoch 36/150
147/500 [=======>......................] - ETA: 2:57 - loss: 0.2156 - regression_loss: 0.2056 - classification_loss: 0.0100 148/500 [=======>......................]
- ETA: 2:56 - loss: 0.2161 - regression_loss: 0.2061 - classification_loss: 0.0100 149/500 [=======>......................] - ETA: 2:56 - loss: 0.2159 - regression_loss: 0.2060 - classification_loss: 0.0099 150/500 [========>.....................] - ETA: 2:55 - loss: 0.2161 - regression_loss: 0.2062 - classification_loss: 0.0099 151/500 [========>.....................] - ETA: 2:55 - loss: 0.2166 - regression_loss: 0.2067 - classification_loss: 0.0099 152/500 [========>.....................] - ETA: 2:55 - loss: 0.2160 - regression_loss: 0.2061 - classification_loss: 0.0099 153/500 [========>.....................] - ETA: 2:54 - loss: 0.2167 - regression_loss: 0.2068 - classification_loss: 0.0099 154/500 [========>.....................] - ETA: 2:54 - loss: 0.2159 - regression_loss: 0.2061 - classification_loss: 0.0098 155/500 [========>.....................] - ETA: 2:53 - loss: 0.2156 - regression_loss: 0.2057 - classification_loss: 0.0098 156/500 [========>.....................] - ETA: 2:53 - loss: 0.2162 - regression_loss: 0.2065 - classification_loss: 0.0098 157/500 [========>.....................] - ETA: 2:52 - loss: 0.2160 - regression_loss: 0.2062 - classification_loss: 0.0098 158/500 [========>.....................] - ETA: 2:51 - loss: 0.2162 - regression_loss: 0.2064 - classification_loss: 0.0098 159/500 [========>.....................] - ETA: 2:51 - loss: 0.2163 - regression_loss: 0.2066 - classification_loss: 0.0097 160/500 [========>.....................] - ETA: 2:51 - loss: 0.2163 - regression_loss: 0.2066 - classification_loss: 0.0097 161/500 [========>.....................] - ETA: 2:50 - loss: 0.2158 - regression_loss: 0.2062 - classification_loss: 0.0096 162/500 [========>.....................] - ETA: 2:50 - loss: 0.2161 - regression_loss: 0.2064 - classification_loss: 0.0096 163/500 [========>.....................] - ETA: 2:49 - loss: 0.2156 - regression_loss: 0.2060 - classification_loss: 0.0096 164/500 [========>.....................] 
- ETA: 2:49 - loss: 0.2156 - regression_loss: 0.2061 - classification_loss: 0.0095 165/500 [========>.....................] - ETA: 2:48 - loss: 0.2164 - regression_loss: 0.2069 - classification_loss: 0.0095 166/500 [========>.....................] - ETA: 2:48 - loss: 0.2177 - regression_loss: 0.2081 - classification_loss: 0.0096 167/500 [=========>....................] - ETA: 2:47 - loss: 0.2185 - regression_loss: 0.2089 - classification_loss: 0.0096 168/500 [=========>....................] - ETA: 2:47 - loss: 0.2184 - regression_loss: 0.2088 - classification_loss: 0.0096 169/500 [=========>....................] - ETA: 2:46 - loss: 0.2185 - regression_loss: 0.2089 - classification_loss: 0.0096 170/500 [=========>....................] - ETA: 2:46 - loss: 0.2176 - regression_loss: 0.2081 - classification_loss: 0.0095 171/500 [=========>....................] - ETA: 2:45 - loss: 0.2188 - regression_loss: 0.2093 - classification_loss: 0.0095 172/500 [=========>....................] - ETA: 2:45 - loss: 0.2184 - regression_loss: 0.2090 - classification_loss: 0.0095 173/500 [=========>....................] - ETA: 2:44 - loss: 0.2183 - regression_loss: 0.2088 - classification_loss: 0.0095 174/500 [=========>....................] - ETA: 2:44 - loss: 0.2190 - regression_loss: 0.2095 - classification_loss: 0.0095 175/500 [=========>....................] - ETA: 2:43 - loss: 0.2185 - regression_loss: 0.2091 - classification_loss: 0.0094 176/500 [=========>....................] - ETA: 2:43 - loss: 0.2194 - regression_loss: 0.2098 - classification_loss: 0.0096 177/500 [=========>....................] - ETA: 2:42 - loss: 0.2192 - regression_loss: 0.2097 - classification_loss: 0.0095 178/500 [=========>....................] - ETA: 2:42 - loss: 0.2202 - regression_loss: 0.2106 - classification_loss: 0.0096 179/500 [=========>....................] - ETA: 2:41 - loss: 0.2201 - regression_loss: 0.2106 - classification_loss: 0.0096 180/500 [=========>....................] 
- ETA: 2:41 - loss: 0.2206 - regression_loss: 0.2110 - classification_loss: 0.0096 181/500 [=========>....................] - ETA: 2:40 - loss: 0.2200 - regression_loss: 0.2104 - classification_loss: 0.0096 182/500 [=========>....................] - ETA: 2:40 - loss: 0.2199 - regression_loss: 0.2104 - classification_loss: 0.0095 183/500 [=========>....................] - ETA: 2:39 - loss: 0.2201 - regression_loss: 0.2106 - classification_loss: 0.0096 184/500 [==========>...................] - ETA: 2:39 - loss: 0.2202 - regression_loss: 0.2106 - classification_loss: 0.0096 185/500 [==========>...................] - ETA: 2:38 - loss: 0.2196 - regression_loss: 0.2100 - classification_loss: 0.0095 186/500 [==========>...................] - ETA: 2:38 - loss: 0.2197 - regression_loss: 0.2101 - classification_loss: 0.0096 187/500 [==========>...................] - ETA: 2:37 - loss: 0.2190 - regression_loss: 0.2095 - classification_loss: 0.0095 188/500 [==========>...................] - ETA: 2:37 - loss: 0.2197 - regression_loss: 0.2100 - classification_loss: 0.0096 189/500 [==========>...................] - ETA: 2:36 - loss: 0.2212 - regression_loss: 0.2116 - classification_loss: 0.0096 190/500 [==========>...................] - ETA: 2:36 - loss: 0.2212 - regression_loss: 0.2116 - classification_loss: 0.0096 191/500 [==========>...................] - ETA: 2:35 - loss: 0.2204 - regression_loss: 0.2109 - classification_loss: 0.0095 192/500 [==========>...................] - ETA: 2:35 - loss: 0.2208 - regression_loss: 0.2113 - classification_loss: 0.0095 193/500 [==========>...................] - ETA: 2:34 - loss: 0.2213 - regression_loss: 0.2118 - classification_loss: 0.0096 194/500 [==========>...................] - ETA: 2:34 - loss: 0.2221 - regression_loss: 0.2126 - classification_loss: 0.0096 195/500 [==========>...................] - ETA: 2:33 - loss: 0.2228 - regression_loss: 0.2132 - classification_loss: 0.0096 196/500 [==========>...................] 
- ETA: 2:33 - loss: 0.2220 - regression_loss: 0.2125 - classification_loss: 0.0095 197/500 [==========>...................] - ETA: 2:32 - loss: 0.2221 - regression_loss: 0.2126 - classification_loss: 0.0096 198/500 [==========>...................] - ETA: 2:32 - loss: 0.2219 - regression_loss: 0.2123 - classification_loss: 0.0095 199/500 [==========>...................] - ETA: 2:31 - loss: 0.2218 - regression_loss: 0.2123 - classification_loss: 0.0095 200/500 [===========>..................] - ETA: 2:31 - loss: 0.2216 - regression_loss: 0.2121 - classification_loss: 0.0095 201/500 [===========>..................] - ETA: 2:30 - loss: 0.2212 - regression_loss: 0.2117 - classification_loss: 0.0095 202/500 [===========>..................] - ETA: 2:30 - loss: 0.2228 - regression_loss: 0.2134 - classification_loss: 0.0095 203/500 [===========>..................] - ETA: 2:29 - loss: 0.2231 - regression_loss: 0.2137 - classification_loss: 0.0094 204/500 [===========>..................] - ETA: 2:28 - loss: 0.2231 - regression_loss: 0.2137 - classification_loss: 0.0094 205/500 [===========>..................] - ETA: 2:28 - loss: 0.2232 - regression_loss: 0.2138 - classification_loss: 0.0094 206/500 [===========>..................] - ETA: 2:27 - loss: 0.2227 - regression_loss: 0.2133 - classification_loss: 0.0094 207/500 [===========>..................] - ETA: 2:27 - loss: 0.2227 - regression_loss: 0.2133 - classification_loss: 0.0093 208/500 [===========>..................] - ETA: 2:26 - loss: 0.2229 - regression_loss: 0.2135 - classification_loss: 0.0093 209/500 [===========>..................] - ETA: 2:26 - loss: 0.2227 - regression_loss: 0.2133 - classification_loss: 0.0093 210/500 [===========>..................] - ETA: 2:25 - loss: 0.2236 - regression_loss: 0.2143 - classification_loss: 0.0094 211/500 [===========>..................] - ETA: 2:25 - loss: 0.2233 - regression_loss: 0.2140 - classification_loss: 0.0093 212/500 [===========>..................] 
- ETA: 2:24 - loss: 0.2240 - regression_loss: 0.2146 - classification_loss: 0.0095 213/500 [===========>..................] - ETA: 2:24 - loss: 0.2233 - regression_loss: 0.2139 - classification_loss: 0.0094 214/500 [===========>..................] - ETA: 2:23 - loss: 0.2229 - regression_loss: 0.2135 - classification_loss: 0.0094 215/500 [===========>..................] - ETA: 2:23 - loss: 0.2231 - regression_loss: 0.2137 - classification_loss: 0.0094 216/500 [===========>..................] - ETA: 2:22 - loss: 0.2230 - regression_loss: 0.2136 - classification_loss: 0.0094 217/500 [============>.................] - ETA: 2:22 - loss: 0.2231 - regression_loss: 0.2137 - classification_loss: 0.0094 218/500 [============>.................] - ETA: 2:21 - loss: 0.2231 - regression_loss: 0.2137 - classification_loss: 0.0094 219/500 [============>.................] - ETA: 2:21 - loss: 0.2227 - regression_loss: 0.2134 - classification_loss: 0.0094 220/500 [============>.................] - ETA: 2:20 - loss: 0.2223 - regression_loss: 0.2130 - classification_loss: 0.0093 221/500 [============>.................] - ETA: 2:20 - loss: 0.2222 - regression_loss: 0.2128 - classification_loss: 0.0093 222/500 [============>.................] - ETA: 2:19 - loss: 0.2219 - regression_loss: 0.2126 - classification_loss: 0.0093 223/500 [============>.................] - ETA: 2:19 - loss: 0.2219 - regression_loss: 0.2125 - classification_loss: 0.0093 224/500 [============>.................] - ETA: 2:18 - loss: 0.2223 - regression_loss: 0.2129 - classification_loss: 0.0094 225/500 [============>.................] - ETA: 2:18 - loss: 0.2216 - regression_loss: 0.2122 - classification_loss: 0.0094 226/500 [============>.................] - ETA: 2:17 - loss: 0.2216 - regression_loss: 0.2122 - classification_loss: 0.0094 227/500 [============>.................] - ETA: 2:17 - loss: 0.2215 - regression_loss: 0.2121 - classification_loss: 0.0094 228/500 [============>.................] 
- ETA: 2:16 - loss: 0.2215 - regression_loss: 0.2121 - classification_loss: 0.0094 229/500 [============>.................] - ETA: 2:16 - loss: 0.2211 - regression_loss: 0.2117 - classification_loss: 0.0094 230/500 [============>.................] - ETA: 2:15 - loss: 0.2207 - regression_loss: 0.2113 - classification_loss: 0.0094 231/500 [============>.................] - ETA: 2:15 - loss: 0.2205 - regression_loss: 0.2111 - classification_loss: 0.0094 232/500 [============>.................] - ETA: 2:14 - loss: 0.2200 - regression_loss: 0.2106 - classification_loss: 0.0094 233/500 [============>.................] - ETA: 2:14 - loss: 0.2196 - regression_loss: 0.2103 - classification_loss: 0.0094 234/500 [=============>................] - ETA: 2:13 - loss: 0.2195 - regression_loss: 0.2101 - classification_loss: 0.0094 235/500 [=============>................] - ETA: 2:13 - loss: 0.2191 - regression_loss: 0.2098 - classification_loss: 0.0093 236/500 [=============>................] - ETA: 2:12 - loss: 0.2188 - regression_loss: 0.2095 - classification_loss: 0.0093 237/500 [=============>................] - ETA: 2:12 - loss: 0.2186 - regression_loss: 0.2093 - classification_loss: 0.0093 238/500 [=============>................] - ETA: 2:11 - loss: 0.2183 - regression_loss: 0.2090 - classification_loss: 0.0092 239/500 [=============>................] - ETA: 2:11 - loss: 0.2180 - regression_loss: 0.2088 - classification_loss: 0.0092 240/500 [=============>................] - ETA: 2:10 - loss: 0.2180 - regression_loss: 0.2088 - classification_loss: 0.0092 241/500 [=============>................] - ETA: 2:10 - loss: 0.2181 - regression_loss: 0.2089 - classification_loss: 0.0092 242/500 [=============>................] - ETA: 2:09 - loss: 0.2177 - regression_loss: 0.2084 - classification_loss: 0.0092 243/500 [=============>................] - ETA: 2:09 - loss: 0.2173 - regression_loss: 0.2081 - classification_loss: 0.0092 244/500 [=============>................] 
- ETA: 2:08 - loss: 0.2175 - regression_loss: 0.2083 - classification_loss: 0.0092 245/500 [=============>................] - ETA: 2:08 - loss: 0.2173 - regression_loss: 0.2081 - classification_loss: 0.0092 246/500 [=============>................] - ETA: 2:07 - loss: 0.2177 - regression_loss: 0.2085 - classification_loss: 0.0092 247/500 [=============>................] - ETA: 2:07 - loss: 0.2179 - regression_loss: 0.2087 - classification_loss: 0.0092 248/500 [=============>................] - ETA: 2:06 - loss: 0.2177 - regression_loss: 0.2085 - classification_loss: 0.0092 249/500 [=============>................] - ETA: 2:06 - loss: 0.2178 - regression_loss: 0.2086 - classification_loss: 0.0092 250/500 [==============>...............] - ETA: 2:05 - loss: 0.2173 - regression_loss: 0.2082 - classification_loss: 0.0091 251/500 [==============>...............] - ETA: 2:05 - loss: 0.2180 - regression_loss: 0.2088 - classification_loss: 0.0092 252/500 [==============>...............] - ETA: 2:04 - loss: 0.2181 - regression_loss: 0.2089 - classification_loss: 0.0092 253/500 [==============>...............] - ETA: 2:04 - loss: 0.2190 - regression_loss: 0.2098 - classification_loss: 0.0092 254/500 [==============>...............] - ETA: 2:03 - loss: 0.2191 - regression_loss: 0.2098 - classification_loss: 0.0092 255/500 [==============>...............] - ETA: 2:03 - loss: 0.2196 - regression_loss: 0.2103 - classification_loss: 0.0093 256/500 [==============>...............] - ETA: 2:02 - loss: 0.2197 - regression_loss: 0.2104 - classification_loss: 0.0092 257/500 [==============>...............] - ETA: 2:02 - loss: 0.2200 - regression_loss: 0.2107 - classification_loss: 0.0093 258/500 [==============>...............] - ETA: 2:01 - loss: 0.2197 - regression_loss: 0.2104 - classification_loss: 0.0093 259/500 [==============>...............] - ETA: 2:01 - loss: 0.2195 - regression_loss: 0.2103 - classification_loss: 0.0092 260/500 [==============>...............] 
- ETA: 2:00 - loss: 0.2191 - regression_loss: 0.2098 - classification_loss: 0.0092 261/500 [==============>...............] - ETA: 2:00 - loss: 0.2188 - regression_loss: 0.2096 - classification_loss: 0.0092 262/500 [==============>...............] - ETA: 1:59 - loss: 0.2187 - regression_loss: 0.2094 - classification_loss: 0.0092 263/500 [==============>...............] - ETA: 1:59 - loss: 0.2184 - regression_loss: 0.2092 - classification_loss: 0.0092 264/500 [==============>...............] - ETA: 1:58 - loss: 0.2189 - regression_loss: 0.2097 - classification_loss: 0.0092 265/500 [==============>...............] - ETA: 1:58 - loss: 0.2187 - regression_loss: 0.2095 - classification_loss: 0.0092 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2189 - regression_loss: 0.2097 - classification_loss: 0.0092 267/500 [===============>..............] - ETA: 1:57 - loss: 0.2189 - regression_loss: 0.2097 - classification_loss: 0.0092 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2194 - regression_loss: 0.2103 - classification_loss: 0.0092 269/500 [===============>..............] - ETA: 1:56 - loss: 0.2191 - regression_loss: 0.2100 - classification_loss: 0.0091 270/500 [===============>..............] - ETA: 1:55 - loss: 0.2193 - regression_loss: 0.2101 - classification_loss: 0.0091 271/500 [===============>..............] - ETA: 1:55 - loss: 0.2190 - regression_loss: 0.2099 - classification_loss: 0.0091 272/500 [===============>..............] - ETA: 1:54 - loss: 0.2185 - regression_loss: 0.2094 - classification_loss: 0.0091 273/500 [===============>..............] - ETA: 1:54 - loss: 0.2181 - regression_loss: 0.2090 - classification_loss: 0.0091 274/500 [===============>..............] - ETA: 1:53 - loss: 0.2181 - regression_loss: 0.2090 - classification_loss: 0.0091 275/500 [===============>..............] - ETA: 1:53 - loss: 0.2181 - regression_loss: 0.2090 - classification_loss: 0.0091 276/500 [===============>..............] 
- ETA: 1:52 - loss: 0.2180 - regression_loss: 0.2089 - classification_loss: 0.0091 277/500 [===============>..............] - ETA: 1:52 - loss: 0.2179 - regression_loss: 0.2089 - classification_loss: 0.0090 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2176 - regression_loss: 0.2086 - classification_loss: 0.0090 279/500 [===============>..............] - ETA: 1:51 - loss: 0.2185 - regression_loss: 0.2094 - classification_loss: 0.0091 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2183 - regression_loss: 0.2092 - classification_loss: 0.0091 281/500 [===============>..............] - ETA: 1:50 - loss: 0.2189 - regression_loss: 0.2098 - classification_loss: 0.0091 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2189 - regression_loss: 0.2098 - classification_loss: 0.0091 283/500 [===============>..............] - ETA: 1:49 - loss: 0.2186 - regression_loss: 0.2096 - classification_loss: 0.0091 284/500 [================>.............] - ETA: 1:48 - loss: 0.2184 - regression_loss: 0.2093 - classification_loss: 0.0091 285/500 [================>.............] - ETA: 1:48 - loss: 0.2184 - regression_loss: 0.2093 - classification_loss: 0.0091 286/500 [================>.............] - ETA: 1:47 - loss: 0.2191 - regression_loss: 0.2100 - classification_loss: 0.0091 287/500 [================>.............] - ETA: 1:47 - loss: 0.2201 - regression_loss: 0.2110 - classification_loss: 0.0091 288/500 [================>.............] - ETA: 1:46 - loss: 0.2207 - regression_loss: 0.2116 - classification_loss: 0.0091 289/500 [================>.............] - ETA: 1:46 - loss: 0.2202 - regression_loss: 0.2111 - classification_loss: 0.0091 290/500 [================>.............] - ETA: 1:45 - loss: 0.2202 - regression_loss: 0.2111 - classification_loss: 0.0091 291/500 [================>.............] - ETA: 1:45 - loss: 0.2202 - regression_loss: 0.2111 - classification_loss: 0.0091 292/500 [================>.............] 
- ETA: 1:44 - loss: 0.2201 - regression_loss: 0.2110 - classification_loss: 0.0091 293/500 [================>.............] - ETA: 1:44 - loss: 0.2200 - regression_loss: 0.2109 - classification_loss: 0.0091 294/500 [================>.............] - ETA: 1:43 - loss: 0.2198 - regression_loss: 0.2107 - classification_loss: 0.0091 295/500 [================>.............] - ETA: 1:43 - loss: 0.2195 - regression_loss: 0.2104 - classification_loss: 0.0091 296/500 [================>.............] - ETA: 1:42 - loss: 0.2192 - regression_loss: 0.2102 - classification_loss: 0.0091 297/500 [================>.............] - ETA: 1:42 - loss: 0.2197 - regression_loss: 0.2106 - classification_loss: 0.0091 298/500 [================>.............] - ETA: 1:41 - loss: 0.2198 - regression_loss: 0.2107 - classification_loss: 0.0091 299/500 [================>.............] - ETA: 1:41 - loss: 0.2192 - regression_loss: 0.2102 - classification_loss: 0.0091 300/500 [=================>............] - ETA: 1:40 - loss: 0.2187 - regression_loss: 0.2096 - classification_loss: 0.0090 301/500 [=================>............] - ETA: 1:40 - loss: 0.2199 - regression_loss: 0.2109 - classification_loss: 0.0091 302/500 [=================>............] - ETA: 1:39 - loss: 0.2199 - regression_loss: 0.2108 - classification_loss: 0.0090 303/500 [=================>............] - ETA: 1:39 - loss: 0.2198 - regression_loss: 0.2107 - classification_loss: 0.0090 304/500 [=================>............] - ETA: 1:38 - loss: 0.2195 - regression_loss: 0.2105 - classification_loss: 0.0090 305/500 [=================>............] - ETA: 1:38 - loss: 0.2192 - regression_loss: 0.2102 - classification_loss: 0.0090 306/500 [=================>............] - ETA: 1:37 - loss: 0.2193 - regression_loss: 0.2103 - classification_loss: 0.0090 307/500 [=================>............] - ETA: 1:37 - loss: 0.2193 - regression_loss: 0.2103 - classification_loss: 0.0090 308/500 [=================>............] 
- ETA: 1:36 - loss: 0.2192 - regression_loss: 0.2102 - classification_loss: 0.0090 309/500 [=================>............] - ETA: 1:36 - loss: 0.2192 - regression_loss: 0.2102 - classification_loss: 0.0090 310/500 [=================>............] - ETA: 1:35 - loss: 0.2193 - regression_loss: 0.2103 - classification_loss: 0.0091 311/500 [=================>............] - ETA: 1:35 - loss: 0.2194 - regression_loss: 0.2104 - classification_loss: 0.0090 312/500 [=================>............] - ETA: 1:34 - loss: 0.2195 - regression_loss: 0.2104 - classification_loss: 0.0091 313/500 [=================>............] - ETA: 1:33 - loss: 0.2195 - regression_loss: 0.2105 - classification_loss: 0.0091 314/500 [=================>............] - ETA: 1:33 - loss: 0.2202 - regression_loss: 0.2111 - classification_loss: 0.0091 315/500 [=================>............] - ETA: 1:32 - loss: 0.2200 - regression_loss: 0.2109 - classification_loss: 0.0091 316/500 [=================>............] - ETA: 1:32 - loss: 0.2198 - regression_loss: 0.2107 - classification_loss: 0.0091 317/500 [==================>...........] - ETA: 1:31 - loss: 0.2195 - regression_loss: 0.2104 - classification_loss: 0.0091 318/500 [==================>...........] - ETA: 1:31 - loss: 0.2195 - regression_loss: 0.2104 - classification_loss: 0.0091 319/500 [==================>...........] - ETA: 1:30 - loss: 0.2193 - regression_loss: 0.2102 - classification_loss: 0.0090 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2189 - regression_loss: 0.2099 - classification_loss: 0.0090 321/500 [==================>...........] - ETA: 1:29 - loss: 0.2185 - regression_loss: 0.2095 - classification_loss: 0.0090 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2182 - regression_loss: 0.2092 - classification_loss: 0.0090 323/500 [==================>...........] - ETA: 1:28 - loss: 0.2184 - regression_loss: 0.2094 - classification_loss: 0.0090 324/500 [==================>...........] 
- ETA: 1:28 - loss: 0.2182 - regression_loss: 0.2092 - classification_loss: 0.0090 325/500 [==================>...........] - ETA: 1:27 - loss: 0.2183 - regression_loss: 0.2093 - classification_loss: 0.0090 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2181 - regression_loss: 0.2091 - classification_loss: 0.0090 327/500 [==================>...........] - ETA: 1:26 - loss: 0.2178 - regression_loss: 0.2088 - classification_loss: 0.0090 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2180 - regression_loss: 0.2090 - classification_loss: 0.0090 329/500 [==================>...........] - ETA: 1:25 - loss: 0.2181 - regression_loss: 0.2092 - classification_loss: 0.0090 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2179 - regression_loss: 0.2090 - classification_loss: 0.0089 331/500 [==================>...........] - ETA: 1:24 - loss: 0.2181 - regression_loss: 0.2091 - classification_loss: 0.0090 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2179 - regression_loss: 0.2089 - classification_loss: 0.0089 333/500 [==================>...........] - ETA: 1:23 - loss: 0.2180 - regression_loss: 0.2090 - classification_loss: 0.0089 334/500 [===================>..........] - ETA: 1:23 - loss: 0.2181 - regression_loss: 0.2092 - classification_loss: 0.0090 335/500 [===================>..........] - ETA: 1:22 - loss: 0.2184 - regression_loss: 0.2094 - classification_loss: 0.0090 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2180 - regression_loss: 0.2091 - classification_loss: 0.0090 337/500 [===================>..........] - ETA: 1:21 - loss: 0.2178 - regression_loss: 0.2089 - classification_loss: 0.0090 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2175 - regression_loss: 0.2085 - classification_loss: 0.0089 339/500 [===================>..........] - ETA: 1:20 - loss: 0.2174 - regression_loss: 0.2085 - classification_loss: 0.0089 340/500 [===================>..........] 
- ETA: 1:20 - loss: 0.2174 - regression_loss: 0.2085 - classification_loss: 0.0089 341/500 [===================>..........] - ETA: 1:19 - loss: 0.2175 - regression_loss: 0.2086 - classification_loss: 0.0089 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2179 - regression_loss: 0.2089 - classification_loss: 0.0090 343/500 [===================>..........] - ETA: 1:18 - loss: 0.2179 - regression_loss: 0.2088 - classification_loss: 0.0090 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2175 - regression_loss: 0.2085 - classification_loss: 0.0090 345/500 [===================>..........] - ETA: 1:17 - loss: 0.2176 - regression_loss: 0.2086 - classification_loss: 0.0090 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2175 - regression_loss: 0.2085 - classification_loss: 0.0090 347/500 [===================>..........] - ETA: 1:16 - loss: 0.2176 - regression_loss: 0.2086 - classification_loss: 0.0090 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2176 - regression_loss: 0.2086 - classification_loss: 0.0090 349/500 [===================>..........] - ETA: 1:15 - loss: 0.2178 - regression_loss: 0.2088 - classification_loss: 0.0090 350/500 [====================>.........] - ETA: 1:15 - loss: 0.2179 - regression_loss: 0.2088 - classification_loss: 0.0091 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2183 - regression_loss: 0.2093 - classification_loss: 0.0091 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2181 - regression_loss: 0.2090 - classification_loss: 0.0091 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2179 - regression_loss: 0.2089 - classification_loss: 0.0091 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2176 - regression_loss: 0.2086 - classification_loss: 0.0090 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2173 - regression_loss: 0.2082 - classification_loss: 0.0090 356/500 [====================>.........] 
[per-batch progress frames for epoch 36, steps 357-499, elided]
500/500 [==============================] - 251s 503ms/step - loss: 0.2130 - regression_loss: 0.2042 - classification_loss: 0.0088
1172 instances of class plum with average precision: 0.7512
mAP: 0.7512

Epoch 00036: saving model to ./training/snapshots/resnet101_pascal_36.h5
Epoch 37/150
[per-batch progress frames for epoch 37, steps 1-188, elided]
189/500 [==========>...................] - ETA: 2:36 - loss: 0.2031 - regression_loss: 0.1948 - classification_loss: 0.0082
- ETA: 2:35 - loss: 0.2046 - regression_loss: 0.1963 - classification_loss: 0.0083 191/500 [==========>...................] - ETA: 2:35 - loss: 0.2038 - regression_loss: 0.1956 - classification_loss: 0.0083 192/500 [==========>...................] - ETA: 2:34 - loss: 0.2047 - regression_loss: 0.1964 - classification_loss: 0.0083 193/500 [==========>...................] - ETA: 2:34 - loss: 0.2040 - regression_loss: 0.1958 - classification_loss: 0.0082 194/500 [==========>...................] - ETA: 2:33 - loss: 0.2049 - regression_loss: 0.1965 - classification_loss: 0.0083 195/500 [==========>...................] - ETA: 2:33 - loss: 0.2044 - regression_loss: 0.1961 - classification_loss: 0.0083 196/500 [==========>...................] - ETA: 2:32 - loss: 0.2047 - regression_loss: 0.1964 - classification_loss: 0.0083 197/500 [==========>...................] - ETA: 2:32 - loss: 0.2046 - regression_loss: 0.1963 - classification_loss: 0.0083 198/500 [==========>...................] - ETA: 2:31 - loss: 0.2056 - regression_loss: 0.1973 - classification_loss: 0.0083 199/500 [==========>...................] - ETA: 2:31 - loss: 0.2056 - regression_loss: 0.1974 - classification_loss: 0.0083 200/500 [===========>..................] - ETA: 2:30 - loss: 0.2060 - regression_loss: 0.1977 - classification_loss: 0.0083 201/500 [===========>..................] - ETA: 2:30 - loss: 0.2065 - regression_loss: 0.1982 - classification_loss: 0.0083 202/500 [===========>..................] - ETA: 2:29 - loss: 0.2073 - regression_loss: 0.1990 - classification_loss: 0.0083 203/500 [===========>..................] - ETA: 2:29 - loss: 0.2073 - regression_loss: 0.1990 - classification_loss: 0.0083 204/500 [===========>..................] - ETA: 2:28 - loss: 0.2076 - regression_loss: 0.1993 - classification_loss: 0.0084 205/500 [===========>..................] - ETA: 2:28 - loss: 0.2081 - regression_loss: 0.1997 - classification_loss: 0.0084 206/500 [===========>..................] 
- ETA: 2:27 - loss: 0.2084 - regression_loss: 0.2000 - classification_loss: 0.0084 207/500 [===========>..................] - ETA: 2:27 - loss: 0.2081 - regression_loss: 0.1997 - classification_loss: 0.0083 208/500 [===========>..................] - ETA: 2:26 - loss: 0.2088 - regression_loss: 0.2004 - classification_loss: 0.0084 209/500 [===========>..................] - ETA: 2:26 - loss: 0.2090 - regression_loss: 0.2006 - classification_loss: 0.0084 210/500 [===========>..................] - ETA: 2:25 - loss: 0.2096 - regression_loss: 0.2012 - classification_loss: 0.0084 211/500 [===========>..................] - ETA: 2:25 - loss: 0.2100 - regression_loss: 0.2016 - classification_loss: 0.0084 212/500 [===========>..................] - ETA: 2:24 - loss: 0.2100 - regression_loss: 0.2016 - classification_loss: 0.0084 213/500 [===========>..................] - ETA: 2:24 - loss: 0.2102 - regression_loss: 0.2018 - classification_loss: 0.0084 214/500 [===========>..................] - ETA: 2:23 - loss: 0.2102 - regression_loss: 0.2018 - classification_loss: 0.0084 215/500 [===========>..................] - ETA: 2:23 - loss: 0.2098 - regression_loss: 0.2015 - classification_loss: 0.0084 216/500 [===========>..................] - ETA: 2:22 - loss: 0.2098 - regression_loss: 0.2015 - classification_loss: 0.0083 217/500 [============>.................] - ETA: 2:22 - loss: 0.2097 - regression_loss: 0.2014 - classification_loss: 0.0083 218/500 [============>.................] - ETA: 2:21 - loss: 0.2103 - regression_loss: 0.2020 - classification_loss: 0.0083 219/500 [============>.................] - ETA: 2:21 - loss: 0.2101 - regression_loss: 0.2017 - classification_loss: 0.0083 220/500 [============>.................] - ETA: 2:20 - loss: 0.2100 - regression_loss: 0.2016 - classification_loss: 0.0084 221/500 [============>.................] - ETA: 2:20 - loss: 0.2096 - regression_loss: 0.2012 - classification_loss: 0.0083 222/500 [============>.................] 
- ETA: 2:19 - loss: 0.2093 - regression_loss: 0.2009 - classification_loss: 0.0083 223/500 [============>.................] - ETA: 2:19 - loss: 0.2087 - regression_loss: 0.2004 - classification_loss: 0.0083 224/500 [============>.................] - ETA: 2:18 - loss: 0.2083 - regression_loss: 0.2001 - classification_loss: 0.0083 225/500 [============>.................] - ETA: 2:18 - loss: 0.2083 - regression_loss: 0.2000 - classification_loss: 0.0083 226/500 [============>.................] - ETA: 2:17 - loss: 0.2081 - regression_loss: 0.1998 - classification_loss: 0.0083 227/500 [============>.................] - ETA: 2:17 - loss: 0.2077 - regression_loss: 0.1995 - classification_loss: 0.0083 228/500 [============>.................] - ETA: 2:16 - loss: 0.2073 - regression_loss: 0.1991 - classification_loss: 0.0083 229/500 [============>.................] - ETA: 2:16 - loss: 0.2070 - regression_loss: 0.1988 - classification_loss: 0.0083 230/500 [============>.................] - ETA: 2:15 - loss: 0.2068 - regression_loss: 0.1985 - classification_loss: 0.0083 231/500 [============>.................] - ETA: 2:15 - loss: 0.2065 - regression_loss: 0.1983 - classification_loss: 0.0082 232/500 [============>.................] - ETA: 2:14 - loss: 0.2072 - regression_loss: 0.1989 - classification_loss: 0.0083 233/500 [============>.................] - ETA: 2:14 - loss: 0.2069 - regression_loss: 0.1986 - classification_loss: 0.0083 234/500 [=============>................] - ETA: 2:13 - loss: 0.2072 - regression_loss: 0.1989 - classification_loss: 0.0083 235/500 [=============>................] - ETA: 2:13 - loss: 0.2073 - regression_loss: 0.1990 - classification_loss: 0.0083 236/500 [=============>................] - ETA: 2:12 - loss: 0.2076 - regression_loss: 0.1993 - classification_loss: 0.0083 237/500 [=============>................] - ETA: 2:12 - loss: 0.2078 - regression_loss: 0.1995 - classification_loss: 0.0083 238/500 [=============>................] 
- ETA: 2:11 - loss: 0.2081 - regression_loss: 0.1998 - classification_loss: 0.0083 239/500 [=============>................] - ETA: 2:11 - loss: 0.2080 - regression_loss: 0.1997 - classification_loss: 0.0083 240/500 [=============>................] - ETA: 2:10 - loss: 0.2083 - regression_loss: 0.2001 - classification_loss: 0.0083 241/500 [=============>................] - ETA: 2:10 - loss: 0.2084 - regression_loss: 0.2001 - classification_loss: 0.0083 242/500 [=============>................] - ETA: 2:09 - loss: 0.2083 - regression_loss: 0.2000 - classification_loss: 0.0083 243/500 [=============>................] - ETA: 2:09 - loss: 0.2078 - regression_loss: 0.1996 - classification_loss: 0.0083 244/500 [=============>................] - ETA: 2:08 - loss: 0.2077 - regression_loss: 0.1995 - classification_loss: 0.0082 245/500 [=============>................] - ETA: 2:08 - loss: 0.2081 - regression_loss: 0.1998 - classification_loss: 0.0082 246/500 [=============>................] - ETA: 2:07 - loss: 0.2079 - regression_loss: 0.1997 - classification_loss: 0.0082 247/500 [=============>................] - ETA: 2:07 - loss: 0.2080 - regression_loss: 0.1997 - classification_loss: 0.0082 248/500 [=============>................] - ETA: 2:06 - loss: 0.2078 - regression_loss: 0.1995 - classification_loss: 0.0082 249/500 [=============>................] - ETA: 2:06 - loss: 0.2073 - regression_loss: 0.1991 - classification_loss: 0.0082 250/500 [==============>...............] - ETA: 2:05 - loss: 0.2073 - regression_loss: 0.1991 - classification_loss: 0.0082 251/500 [==============>...............] - ETA: 2:05 - loss: 0.2074 - regression_loss: 0.1992 - classification_loss: 0.0082 252/500 [==============>...............] - ETA: 2:04 - loss: 0.2069 - regression_loss: 0.1987 - classification_loss: 0.0082 253/500 [==============>...............] - ETA: 2:04 - loss: 0.2065 - regression_loss: 0.1983 - classification_loss: 0.0082 254/500 [==============>...............] 
- ETA: 2:03 - loss: 0.2068 - regression_loss: 0.1986 - classification_loss: 0.0082 255/500 [==============>...............] - ETA: 2:03 - loss: 0.2065 - regression_loss: 0.1983 - classification_loss: 0.0082 256/500 [==============>...............] - ETA: 2:02 - loss: 0.2070 - regression_loss: 0.1989 - classification_loss: 0.0082 257/500 [==============>...............] - ETA: 2:02 - loss: 0.2066 - regression_loss: 0.1984 - classification_loss: 0.0082 258/500 [==============>...............] - ETA: 2:01 - loss: 0.2063 - regression_loss: 0.1981 - classification_loss: 0.0081 259/500 [==============>...............] - ETA: 2:01 - loss: 0.2057 - regression_loss: 0.1975 - classification_loss: 0.0081 260/500 [==============>...............] - ETA: 2:00 - loss: 0.2052 - regression_loss: 0.1970 - classification_loss: 0.0081 261/500 [==============>...............] - ETA: 2:00 - loss: 0.2050 - regression_loss: 0.1969 - classification_loss: 0.0081 262/500 [==============>...............] - ETA: 1:59 - loss: 0.2047 - regression_loss: 0.1966 - classification_loss: 0.0081 263/500 [==============>...............] - ETA: 1:59 - loss: 0.2053 - regression_loss: 0.1973 - classification_loss: 0.0080 264/500 [==============>...............] - ETA: 1:58 - loss: 0.2048 - regression_loss: 0.1968 - classification_loss: 0.0080 265/500 [==============>...............] - ETA: 1:58 - loss: 0.2070 - regression_loss: 0.1989 - classification_loss: 0.0081 266/500 [==============>...............] - ETA: 1:57 - loss: 0.2068 - regression_loss: 0.1987 - classification_loss: 0.0081 267/500 [===============>..............] - ETA: 1:57 - loss: 0.2081 - regression_loss: 0.2000 - classification_loss: 0.0081 268/500 [===============>..............] - ETA: 1:56 - loss: 0.2083 - regression_loss: 0.2001 - classification_loss: 0.0081 269/500 [===============>..............] - ETA: 1:56 - loss: 0.2079 - regression_loss: 0.1998 - classification_loss: 0.0081 270/500 [===============>..............] 
- ETA: 1:55 - loss: 0.2082 - regression_loss: 0.2001 - classification_loss: 0.0081 271/500 [===============>..............] - ETA: 1:55 - loss: 0.2081 - regression_loss: 0.2000 - classification_loss: 0.0081 272/500 [===============>..............] - ETA: 1:54 - loss: 0.2082 - regression_loss: 0.2001 - classification_loss: 0.0081 273/500 [===============>..............] - ETA: 1:54 - loss: 0.2081 - regression_loss: 0.2000 - classification_loss: 0.0080 274/500 [===============>..............] - ETA: 1:53 - loss: 0.2089 - regression_loss: 0.2009 - classification_loss: 0.0080 275/500 [===============>..............] - ETA: 1:53 - loss: 0.2091 - regression_loss: 0.2010 - classification_loss: 0.0081 276/500 [===============>..............] - ETA: 1:52 - loss: 0.2094 - regression_loss: 0.2014 - classification_loss: 0.0081 277/500 [===============>..............] - ETA: 1:52 - loss: 0.2110 - regression_loss: 0.2028 - classification_loss: 0.0083 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2108 - regression_loss: 0.2025 - classification_loss: 0.0082 279/500 [===============>..............] - ETA: 1:51 - loss: 0.2107 - regression_loss: 0.2024 - classification_loss: 0.0082 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2105 - regression_loss: 0.2023 - classification_loss: 0.0082 281/500 [===============>..............] - ETA: 1:50 - loss: 0.2102 - regression_loss: 0.2020 - classification_loss: 0.0082 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2103 - regression_loss: 0.2021 - classification_loss: 0.0082 283/500 [===============>..............] - ETA: 1:49 - loss: 0.2109 - regression_loss: 0.2028 - classification_loss: 0.0082 284/500 [================>.............] - ETA: 1:48 - loss: 0.2109 - regression_loss: 0.2027 - classification_loss: 0.0082 285/500 [================>.............] - ETA: 1:48 - loss: 0.2112 - regression_loss: 0.2030 - classification_loss: 0.0082 286/500 [================>.............] 
- ETA: 1:47 - loss: 0.2122 - regression_loss: 0.2040 - classification_loss: 0.0083 287/500 [================>.............] - ETA: 1:47 - loss: 0.2125 - regression_loss: 0.2042 - classification_loss: 0.0083 288/500 [================>.............] - ETA: 1:46 - loss: 0.2130 - regression_loss: 0.2046 - classification_loss: 0.0084 289/500 [================>.............] - ETA: 1:46 - loss: 0.2133 - regression_loss: 0.2049 - classification_loss: 0.0084 290/500 [================>.............] - ETA: 1:45 - loss: 0.2128 - regression_loss: 0.2045 - classification_loss: 0.0084 291/500 [================>.............] - ETA: 1:45 - loss: 0.2128 - regression_loss: 0.2044 - classification_loss: 0.0084 292/500 [================>.............] - ETA: 1:44 - loss: 0.2130 - regression_loss: 0.2046 - classification_loss: 0.0084 293/500 [================>.............] - ETA: 1:44 - loss: 0.2134 - regression_loss: 0.2050 - classification_loss: 0.0083 294/500 [================>.............] - ETA: 1:43 - loss: 0.2133 - regression_loss: 0.2049 - classification_loss: 0.0083 295/500 [================>.............] - ETA: 1:43 - loss: 0.2142 - regression_loss: 0.2058 - classification_loss: 0.0084 296/500 [================>.............] - ETA: 1:42 - loss: 0.2140 - regression_loss: 0.2057 - classification_loss: 0.0084 297/500 [================>.............] - ETA: 1:42 - loss: 0.2140 - regression_loss: 0.2056 - classification_loss: 0.0084 298/500 [================>.............] - ETA: 1:41 - loss: 0.2140 - regression_loss: 0.2056 - classification_loss: 0.0084 299/500 [================>.............] - ETA: 1:41 - loss: 0.2136 - regression_loss: 0.2053 - classification_loss: 0.0084 300/500 [=================>............] - ETA: 1:40 - loss: 0.2133 - regression_loss: 0.2049 - classification_loss: 0.0084 301/500 [=================>............] - ETA: 1:40 - loss: 0.2136 - regression_loss: 0.2052 - classification_loss: 0.0084 302/500 [=================>............] 
- ETA: 1:39 - loss: 0.2135 - regression_loss: 0.2051 - classification_loss: 0.0084 303/500 [=================>............] - ETA: 1:39 - loss: 0.2136 - regression_loss: 0.2052 - classification_loss: 0.0084 304/500 [=================>............] - ETA: 1:38 - loss: 0.2136 - regression_loss: 0.2053 - classification_loss: 0.0083 305/500 [=================>............] - ETA: 1:38 - loss: 0.2131 - regression_loss: 0.2048 - classification_loss: 0.0083 306/500 [=================>............] - ETA: 1:37 - loss: 0.2130 - regression_loss: 0.2047 - classification_loss: 0.0083 307/500 [=================>............] - ETA: 1:37 - loss: 0.2130 - regression_loss: 0.2046 - classification_loss: 0.0084 308/500 [=================>............] - ETA: 1:36 - loss: 0.2126 - regression_loss: 0.2042 - classification_loss: 0.0083 309/500 [=================>............] - ETA: 1:36 - loss: 0.2125 - regression_loss: 0.2042 - classification_loss: 0.0083 310/500 [=================>............] - ETA: 1:35 - loss: 0.2130 - regression_loss: 0.2047 - classification_loss: 0.0083 311/500 [=================>............] - ETA: 1:35 - loss: 0.2132 - regression_loss: 0.2049 - classification_loss: 0.0083 312/500 [=================>............] - ETA: 1:34 - loss: 0.2135 - regression_loss: 0.2052 - classification_loss: 0.0083 313/500 [=================>............] - ETA: 1:34 - loss: 0.2135 - regression_loss: 0.2052 - classification_loss: 0.0083 314/500 [=================>............] - ETA: 1:33 - loss: 0.2134 - regression_loss: 0.2052 - classification_loss: 0.0083 315/500 [=================>............] - ETA: 1:33 - loss: 0.2138 - regression_loss: 0.2055 - classification_loss: 0.0083 316/500 [=================>............] - ETA: 1:32 - loss: 0.2138 - regression_loss: 0.2055 - classification_loss: 0.0083 317/500 [==================>...........] - ETA: 1:32 - loss: 0.2136 - regression_loss: 0.2053 - classification_loss: 0.0083 318/500 [==================>...........] 
- ETA: 1:31 - loss: 0.2137 - regression_loss: 0.2054 - classification_loss: 0.0083 319/500 [==================>...........] - ETA: 1:31 - loss: 0.2134 - regression_loss: 0.2051 - classification_loss: 0.0083 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2138 - regression_loss: 0.2055 - classification_loss: 0.0083 321/500 [==================>...........] - ETA: 1:30 - loss: 0.2133 - regression_loss: 0.2050 - classification_loss: 0.0082 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2132 - regression_loss: 0.2050 - classification_loss: 0.0082 323/500 [==================>...........] - ETA: 1:29 - loss: 0.2130 - regression_loss: 0.2048 - classification_loss: 0.0082 324/500 [==================>...........] - ETA: 1:28 - loss: 0.2138 - regression_loss: 0.2056 - classification_loss: 0.0082 325/500 [==================>...........] - ETA: 1:27 - loss: 0.2142 - regression_loss: 0.2060 - classification_loss: 0.0082 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2141 - regression_loss: 0.2059 - classification_loss: 0.0082 327/500 [==================>...........] - ETA: 1:26 - loss: 0.2141 - regression_loss: 0.2059 - classification_loss: 0.0082 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2138 - regression_loss: 0.2056 - classification_loss: 0.0082 329/500 [==================>...........] - ETA: 1:25 - loss: 0.2136 - regression_loss: 0.2054 - classification_loss: 0.0082 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2141 - regression_loss: 0.2059 - classification_loss: 0.0082 331/500 [==================>...........] - ETA: 1:24 - loss: 0.2139 - regression_loss: 0.2057 - classification_loss: 0.0082 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2142 - regression_loss: 0.2059 - classification_loss: 0.0082 333/500 [==================>...........] - ETA: 1:23 - loss: 0.2142 - regression_loss: 0.2060 - classification_loss: 0.0082 334/500 [===================>..........] 
- ETA: 1:23 - loss: 0.2141 - regression_loss: 0.2059 - classification_loss: 0.0082 335/500 [===================>..........] - ETA: 1:22 - loss: 0.2140 - regression_loss: 0.2058 - classification_loss: 0.0082 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2137 - regression_loss: 0.2055 - classification_loss: 0.0082 337/500 [===================>..........] - ETA: 1:21 - loss: 0.2141 - regression_loss: 0.2058 - classification_loss: 0.0082 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2141 - regression_loss: 0.2059 - classification_loss: 0.0082 339/500 [===================>..........] - ETA: 1:20 - loss: 0.2143 - regression_loss: 0.2061 - classification_loss: 0.0082 340/500 [===================>..........] - ETA: 1:20 - loss: 0.2139 - regression_loss: 0.2057 - classification_loss: 0.0082 341/500 [===================>..........] - ETA: 1:19 - loss: 0.2135 - regression_loss: 0.2053 - classification_loss: 0.0082 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2138 - regression_loss: 0.2056 - classification_loss: 0.0082 343/500 [===================>..........] - ETA: 1:18 - loss: 0.2139 - regression_loss: 0.2057 - classification_loss: 0.0082 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2138 - regression_loss: 0.2056 - classification_loss: 0.0082 345/500 [===================>..........] - ETA: 1:17 - loss: 0.2136 - regression_loss: 0.2054 - classification_loss: 0.0082 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2136 - regression_loss: 0.2053 - classification_loss: 0.0082 347/500 [===================>..........] - ETA: 1:16 - loss: 0.2138 - regression_loss: 0.2056 - classification_loss: 0.0082 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2136 - regression_loss: 0.2054 - classification_loss: 0.0082 349/500 [===================>..........] - ETA: 1:15 - loss: 0.2136 - regression_loss: 0.2054 - classification_loss: 0.0082 350/500 [====================>.........] 
- ETA: 1:15 - loss: 0.2135 - regression_loss: 0.2053 - classification_loss: 0.0082 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2133 - regression_loss: 0.2051 - classification_loss: 0.0082 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2135 - regression_loss: 0.2053 - classification_loss: 0.0082 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2133 - regression_loss: 0.2051 - classification_loss: 0.0082 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2133 - regression_loss: 0.2051 - classification_loss: 0.0082 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2132 - regression_loss: 0.2050 - classification_loss: 0.0082 356/500 [====================>.........] - ETA: 1:12 - loss: 0.2129 - regression_loss: 0.2047 - classification_loss: 0.0082 357/500 [====================>.........] - ETA: 1:11 - loss: 0.2124 - regression_loss: 0.2042 - classification_loss: 0.0082 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2123 - regression_loss: 0.2041 - classification_loss: 0.0082 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2126 - regression_loss: 0.2044 - classification_loss: 0.0082 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2124 - regression_loss: 0.2042 - classification_loss: 0.0082 361/500 [====================>.........] - ETA: 1:09 - loss: 0.2123 - regression_loss: 0.2041 - classification_loss: 0.0082 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2124 - regression_loss: 0.2042 - classification_loss: 0.0082 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2126 - regression_loss: 0.2044 - classification_loss: 0.0082 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2124 - regression_loss: 0.2042 - classification_loss: 0.0082 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2126 - regression_loss: 0.2044 - classification_loss: 0.0082 366/500 [====================>.........] 
- ETA: 1:07 - loss: 0.2124 - regression_loss: 0.2042 - classification_loss: 0.0082 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2127 - regression_loss: 0.2045 - classification_loss: 0.0082 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2123 - regression_loss: 0.2041 - classification_loss: 0.0082 369/500 [=====================>........] - ETA: 1:05 - loss: 0.2126 - regression_loss: 0.2043 - classification_loss: 0.0082 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2128 - regression_loss: 0.2045 - classification_loss: 0.0082 371/500 [=====================>........] - ETA: 1:04 - loss: 0.2131 - regression_loss: 0.2049 - classification_loss: 0.0082 372/500 [=====================>........] - ETA: 1:04 - loss: 0.2134 - regression_loss: 0.2052 - classification_loss: 0.0082 373/500 [=====================>........] - ETA: 1:03 - loss: 0.2133 - regression_loss: 0.2051 - classification_loss: 0.0082 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2133 - regression_loss: 0.2051 - classification_loss: 0.0082 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2135 - regression_loss: 0.2053 - classification_loss: 0.0082 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2133 - regression_loss: 0.2051 - classification_loss: 0.0082 377/500 [=====================>........] - ETA: 1:01 - loss: 0.2134 - regression_loss: 0.2052 - classification_loss: 0.0082 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2136 - regression_loss: 0.2054 - classification_loss: 0.0083 379/500 [=====================>........] - ETA: 1:00 - loss: 0.2135 - regression_loss: 0.2053 - classification_loss: 0.0083 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2133 - regression_loss: 0.2050 - classification_loss: 0.0083 381/500 [=====================>........] - ETA: 59s - loss: 0.2130 - regression_loss: 0.2047 - classification_loss: 0.0082 382/500 [=====================>........] 
- ETA: 59s - loss: 0.2128 - regression_loss: 0.2045 - classification_loss: 0.0082 383/500 [=====================>........] - ETA: 58s - loss: 0.2129 - regression_loss: 0.2047 - classification_loss: 0.0082 384/500 [======================>.......] - ETA: 58s - loss: 0.2126 - regression_loss: 0.2044 - classification_loss: 0.0082 385/500 [======================>.......] - ETA: 57s - loss: 0.2126 - regression_loss: 0.2043 - classification_loss: 0.0082 386/500 [======================>.......] - ETA: 57s - loss: 0.2128 - regression_loss: 0.2046 - classification_loss: 0.0082 387/500 [======================>.......] - ETA: 56s - loss: 0.2127 - regression_loss: 0.2045 - classification_loss: 0.0082 388/500 [======================>.......] - ETA: 56s - loss: 0.2127 - regression_loss: 0.2044 - classification_loss: 0.0082 389/500 [======================>.......] - ETA: 55s - loss: 0.2125 - regression_loss: 0.2043 - classification_loss: 0.0082 390/500 [======================>.......] - ETA: 55s - loss: 0.2123 - regression_loss: 0.2041 - classification_loss: 0.0082 391/500 [======================>.......] - ETA: 54s - loss: 0.2121 - regression_loss: 0.2039 - classification_loss: 0.0082 392/500 [======================>.......] - ETA: 54s - loss: 0.2123 - regression_loss: 0.2041 - classification_loss: 0.0082 393/500 [======================>.......] - ETA: 53s - loss: 0.2123 - regression_loss: 0.2041 - classification_loss: 0.0082 394/500 [======================>.......] - ETA: 53s - loss: 0.2123 - regression_loss: 0.2041 - classification_loss: 0.0082 395/500 [======================>.......] - ETA: 52s - loss: 0.2122 - regression_loss: 0.2040 - classification_loss: 0.0082 396/500 [======================>.......] - ETA: 52s - loss: 0.2124 - regression_loss: 0.2042 - classification_loss: 0.0082 397/500 [======================>.......] - ETA: 51s - loss: 0.2121 - regression_loss: 0.2039 - classification_loss: 0.0082 398/500 [======================>.......] 
[... per-batch progress updates for steps 399–500 of epoch 37 omitted; loss steady around 0.21 ...]
500/500 [==============================] - 251s 502ms/step - loss: 0.2141 - regression_loss: 0.2056 - classification_loss: 0.0085
1172 instances of class plum with average precision: 0.7544
mAP: 0.7544
Epoch 00037: saving model to ./training/snapshots/resnet101_pascal_37.h5
Epoch 38/150
[... per-batch progress updates for steps 1–233 of epoch 38 omitted; loss around 0.19 (regression ≈ 0.186, classification ≈ 0.007) at step 233/500 ...]
- ETA: 2:13 - loss: 0.1932 - regression_loss: 0.1861 - classification_loss: 0.0071 234/500 [=============>................] - ETA: 2:13 - loss: 0.1931 - regression_loss: 0.1859 - classification_loss: 0.0071 235/500 [=============>................] - ETA: 2:12 - loss: 0.1934 - regression_loss: 0.1862 - classification_loss: 0.0071 236/500 [=============>................] - ETA: 2:12 - loss: 0.1938 - regression_loss: 0.1866 - classification_loss: 0.0072 237/500 [=============>................] - ETA: 2:11 - loss: 0.1934 - regression_loss: 0.1863 - classification_loss: 0.0072 238/500 [=============>................] - ETA: 2:11 - loss: 0.1933 - regression_loss: 0.1861 - classification_loss: 0.0071 239/500 [=============>................] - ETA: 2:10 - loss: 0.1937 - regression_loss: 0.1866 - classification_loss: 0.0071 240/500 [=============>................] - ETA: 2:10 - loss: 0.1939 - regression_loss: 0.1867 - classification_loss: 0.0072 241/500 [=============>................] - ETA: 2:09 - loss: 0.1944 - regression_loss: 0.1872 - classification_loss: 0.0072 242/500 [=============>................] - ETA: 2:09 - loss: 0.1941 - regression_loss: 0.1869 - classification_loss: 0.0072 243/500 [=============>................] - ETA: 2:08 - loss: 0.1938 - regression_loss: 0.1866 - classification_loss: 0.0072 244/500 [=============>................] - ETA: 2:08 - loss: 0.1939 - regression_loss: 0.1866 - classification_loss: 0.0073 245/500 [=============>................] - ETA: 2:07 - loss: 0.1941 - regression_loss: 0.1869 - classification_loss: 0.0072 246/500 [=============>................] - ETA: 2:07 - loss: 0.1938 - regression_loss: 0.1866 - classification_loss: 0.0072 247/500 [=============>................] - ETA: 2:06 - loss: 0.1942 - regression_loss: 0.1869 - classification_loss: 0.0072 248/500 [=============>................] - ETA: 2:06 - loss: 0.1947 - regression_loss: 0.1874 - classification_loss: 0.0072 249/500 [=============>................] 
- ETA: 2:05 - loss: 0.1949 - regression_loss: 0.1876 - classification_loss: 0.0072 250/500 [==============>...............] - ETA: 2:05 - loss: 0.1950 - regression_loss: 0.1878 - classification_loss: 0.0072 251/500 [==============>...............] - ETA: 2:04 - loss: 0.1951 - regression_loss: 0.1878 - classification_loss: 0.0073 252/500 [==============>...............] - ETA: 2:04 - loss: 0.1954 - regression_loss: 0.1881 - classification_loss: 0.0073 253/500 [==============>...............] - ETA: 2:03 - loss: 0.1953 - regression_loss: 0.1880 - classification_loss: 0.0073 254/500 [==============>...............] - ETA: 2:03 - loss: 0.1953 - regression_loss: 0.1880 - classification_loss: 0.0073 255/500 [==============>...............] - ETA: 2:02 - loss: 0.1952 - regression_loss: 0.1880 - classification_loss: 0.0073 256/500 [==============>...............] - ETA: 2:02 - loss: 0.1955 - regression_loss: 0.1882 - classification_loss: 0.0073 257/500 [==============>...............] - ETA: 2:01 - loss: 0.1953 - regression_loss: 0.1880 - classification_loss: 0.0073 258/500 [==============>...............] - ETA: 2:01 - loss: 0.1959 - regression_loss: 0.1886 - classification_loss: 0.0073 259/500 [==============>...............] - ETA: 2:00 - loss: 0.1958 - regression_loss: 0.1885 - classification_loss: 0.0073 260/500 [==============>...............] - ETA: 2:00 - loss: 0.1961 - regression_loss: 0.1888 - classification_loss: 0.0073 261/500 [==============>...............] - ETA: 1:59 - loss: 0.1961 - regression_loss: 0.1888 - classification_loss: 0.0073 262/500 [==============>...............] - ETA: 1:59 - loss: 0.1957 - regression_loss: 0.1885 - classification_loss: 0.0073 263/500 [==============>...............] - ETA: 1:58 - loss: 0.1955 - regression_loss: 0.1882 - classification_loss: 0.0072 264/500 [==============>...............] - ETA: 1:58 - loss: 0.1958 - regression_loss: 0.1886 - classification_loss: 0.0072 265/500 [==============>...............] 
- ETA: 1:57 - loss: 0.1959 - regression_loss: 0.1887 - classification_loss: 0.0073 266/500 [==============>...............] - ETA: 1:57 - loss: 0.1955 - regression_loss: 0.1883 - classification_loss: 0.0072 267/500 [===============>..............] - ETA: 1:56 - loss: 0.1957 - regression_loss: 0.1884 - classification_loss: 0.0073 268/500 [===============>..............] - ETA: 1:56 - loss: 0.1960 - regression_loss: 0.1887 - classification_loss: 0.0073 269/500 [===============>..............] - ETA: 1:55 - loss: 0.1959 - regression_loss: 0.1886 - classification_loss: 0.0073 270/500 [===============>..............] - ETA: 1:55 - loss: 0.1956 - regression_loss: 0.1883 - classification_loss: 0.0073 271/500 [===============>..............] - ETA: 1:54 - loss: 0.1957 - regression_loss: 0.1884 - classification_loss: 0.0073 272/500 [===============>..............] - ETA: 1:54 - loss: 0.1958 - regression_loss: 0.1885 - classification_loss: 0.0073 273/500 [===============>..............] - ETA: 1:53 - loss: 0.1958 - regression_loss: 0.1884 - classification_loss: 0.0073 274/500 [===============>..............] - ETA: 1:53 - loss: 0.1957 - regression_loss: 0.1883 - classification_loss: 0.0073 275/500 [===============>..............] - ETA: 1:52 - loss: 0.1960 - regression_loss: 0.1887 - classification_loss: 0.0074 276/500 [===============>..............] - ETA: 1:52 - loss: 0.1959 - regression_loss: 0.1885 - classification_loss: 0.0074 277/500 [===============>..............] - ETA: 1:51 - loss: 0.1958 - regression_loss: 0.1884 - classification_loss: 0.0074 278/500 [===============>..............] - ETA: 1:51 - loss: 0.1957 - regression_loss: 0.1884 - classification_loss: 0.0074 279/500 [===============>..............] - ETA: 1:50 - loss: 0.1955 - regression_loss: 0.1881 - classification_loss: 0.0074 280/500 [===============>..............] - ETA: 1:50 - loss: 0.1952 - regression_loss: 0.1878 - classification_loss: 0.0074 281/500 [===============>..............] 
- ETA: 1:49 - loss: 0.1950 - regression_loss: 0.1877 - classification_loss: 0.0073 282/500 [===============>..............] - ETA: 1:49 - loss: 0.1953 - regression_loss: 0.1879 - classification_loss: 0.0074 283/500 [===============>..............] - ETA: 1:48 - loss: 0.1952 - regression_loss: 0.1878 - classification_loss: 0.0074 284/500 [================>.............] - ETA: 1:48 - loss: 0.1955 - regression_loss: 0.1881 - classification_loss: 0.0074 285/500 [================>.............] - ETA: 1:47 - loss: 0.1951 - regression_loss: 0.1876 - classification_loss: 0.0074 286/500 [================>.............] - ETA: 1:47 - loss: 0.1954 - regression_loss: 0.1880 - classification_loss: 0.0075 287/500 [================>.............] - ETA: 1:46 - loss: 0.1953 - regression_loss: 0.1879 - classification_loss: 0.0074 288/500 [================>.............] - ETA: 1:46 - loss: 0.1952 - regression_loss: 0.1878 - classification_loss: 0.0074 289/500 [================>.............] - ETA: 1:45 - loss: 0.1949 - regression_loss: 0.1875 - classification_loss: 0.0074 290/500 [================>.............] - ETA: 1:45 - loss: 0.1946 - regression_loss: 0.1872 - classification_loss: 0.0074 291/500 [================>.............] - ETA: 1:44 - loss: 0.1949 - regression_loss: 0.1875 - classification_loss: 0.0074 292/500 [================>.............] - ETA: 1:44 - loss: 0.1958 - regression_loss: 0.1884 - classification_loss: 0.0074 293/500 [================>.............] - ETA: 1:43 - loss: 0.1962 - regression_loss: 0.1887 - classification_loss: 0.0074 294/500 [================>.............] - ETA: 1:43 - loss: 0.1958 - regression_loss: 0.1883 - classification_loss: 0.0074 295/500 [================>.............] - ETA: 1:42 - loss: 0.1957 - regression_loss: 0.1883 - classification_loss: 0.0074 296/500 [================>.............] - ETA: 1:42 - loss: 0.1957 - regression_loss: 0.1883 - classification_loss: 0.0074 297/500 [================>.............] 
- ETA: 1:41 - loss: 0.1956 - regression_loss: 0.1881 - classification_loss: 0.0074 298/500 [================>.............] - ETA: 1:41 - loss: 0.1959 - regression_loss: 0.1884 - classification_loss: 0.0075 299/500 [================>.............] - ETA: 1:40 - loss: 0.1956 - regression_loss: 0.1881 - classification_loss: 0.0075 300/500 [=================>............] - ETA: 1:40 - loss: 0.1953 - regression_loss: 0.1879 - classification_loss: 0.0074 301/500 [=================>............] - ETA: 1:39 - loss: 0.1952 - regression_loss: 0.1877 - classification_loss: 0.0074 302/500 [=================>............] - ETA: 1:39 - loss: 0.1949 - regression_loss: 0.1875 - classification_loss: 0.0074 303/500 [=================>............] - ETA: 1:38 - loss: 0.1953 - regression_loss: 0.1879 - classification_loss: 0.0074 304/500 [=================>............] - ETA: 1:38 - loss: 0.1957 - regression_loss: 0.1883 - classification_loss: 0.0074 305/500 [=================>............] - ETA: 1:37 - loss: 0.1954 - regression_loss: 0.1879 - classification_loss: 0.0074 306/500 [=================>............] - ETA: 1:37 - loss: 0.1950 - regression_loss: 0.1876 - classification_loss: 0.0074 307/500 [=================>............] - ETA: 1:36 - loss: 0.1951 - regression_loss: 0.1877 - classification_loss: 0.0074 308/500 [=================>............] - ETA: 1:36 - loss: 0.1957 - regression_loss: 0.1882 - classification_loss: 0.0074 309/500 [=================>............] - ETA: 1:35 - loss: 0.1955 - regression_loss: 0.1881 - classification_loss: 0.0074 310/500 [=================>............] - ETA: 1:35 - loss: 0.1959 - regression_loss: 0.1885 - classification_loss: 0.0074 311/500 [=================>............] - ETA: 1:34 - loss: 0.1959 - regression_loss: 0.1885 - classification_loss: 0.0074 312/500 [=================>............] - ETA: 1:34 - loss: 0.1957 - regression_loss: 0.1883 - classification_loss: 0.0074 313/500 [=================>............] 
- ETA: 1:33 - loss: 0.1952 - regression_loss: 0.1878 - classification_loss: 0.0074 314/500 [=================>............] - ETA: 1:33 - loss: 0.1954 - regression_loss: 0.1879 - classification_loss: 0.0075 315/500 [=================>............] - ETA: 1:32 - loss: 0.1954 - regression_loss: 0.1880 - classification_loss: 0.0075 316/500 [=================>............] - ETA: 1:32 - loss: 0.1951 - regression_loss: 0.1876 - classification_loss: 0.0075 317/500 [==================>...........] - ETA: 1:31 - loss: 0.1950 - regression_loss: 0.1875 - classification_loss: 0.0074 318/500 [==================>...........] - ETA: 1:31 - loss: 0.1946 - regression_loss: 0.1871 - classification_loss: 0.0074 319/500 [==================>...........] - ETA: 1:30 - loss: 0.1953 - regression_loss: 0.1878 - classification_loss: 0.0074 320/500 [==================>...........] - ETA: 1:30 - loss: 0.1956 - regression_loss: 0.1882 - classification_loss: 0.0074 321/500 [==================>...........] - ETA: 1:29 - loss: 0.1955 - regression_loss: 0.1881 - classification_loss: 0.0074 322/500 [==================>...........] - ETA: 1:29 - loss: 0.1961 - regression_loss: 0.1887 - classification_loss: 0.0074 323/500 [==================>...........] - ETA: 1:28 - loss: 0.1964 - regression_loss: 0.1890 - classification_loss: 0.0074 324/500 [==================>...........] - ETA: 1:28 - loss: 0.1962 - regression_loss: 0.1888 - classification_loss: 0.0074 325/500 [==================>...........] - ETA: 1:27 - loss: 0.1967 - regression_loss: 0.1891 - classification_loss: 0.0076 326/500 [==================>...........] - ETA: 1:27 - loss: 0.1970 - regression_loss: 0.1893 - classification_loss: 0.0076 327/500 [==================>...........] - ETA: 1:26 - loss: 0.1966 - regression_loss: 0.1890 - classification_loss: 0.0076 328/500 [==================>...........] - ETA: 1:26 - loss: 0.1970 - regression_loss: 0.1894 - classification_loss: 0.0076 329/500 [==================>...........] 
- ETA: 1:25 - loss: 0.1968 - regression_loss: 0.1892 - classification_loss: 0.0076 330/500 [==================>...........] - ETA: 1:25 - loss: 0.1969 - regression_loss: 0.1894 - classification_loss: 0.0076 331/500 [==================>...........] - ETA: 1:24 - loss: 0.1974 - regression_loss: 0.1898 - classification_loss: 0.0076 332/500 [==================>...........] - ETA: 1:24 - loss: 0.1976 - regression_loss: 0.1900 - classification_loss: 0.0076 333/500 [==================>...........] - ETA: 1:23 - loss: 0.1979 - regression_loss: 0.1903 - classification_loss: 0.0076 334/500 [===================>..........] - ETA: 1:23 - loss: 0.1979 - regression_loss: 0.1904 - classification_loss: 0.0076 335/500 [===================>..........] - ETA: 1:22 - loss: 0.1977 - regression_loss: 0.1901 - classification_loss: 0.0076 336/500 [===================>..........] - ETA: 1:22 - loss: 0.1975 - regression_loss: 0.1899 - classification_loss: 0.0076 337/500 [===================>..........] - ETA: 1:21 - loss: 0.1971 - regression_loss: 0.1896 - classification_loss: 0.0075 338/500 [===================>..........] - ETA: 1:21 - loss: 0.1972 - regression_loss: 0.1897 - classification_loss: 0.0075 339/500 [===================>..........] - ETA: 1:20 - loss: 0.1971 - regression_loss: 0.1895 - classification_loss: 0.0075 340/500 [===================>..........] - ETA: 1:20 - loss: 0.1971 - regression_loss: 0.1895 - classification_loss: 0.0075 341/500 [===================>..........] - ETA: 1:19 - loss: 0.1975 - regression_loss: 0.1900 - classification_loss: 0.0075 342/500 [===================>..........] - ETA: 1:19 - loss: 0.1975 - regression_loss: 0.1900 - classification_loss: 0.0075 343/500 [===================>..........] - ETA: 1:18 - loss: 0.1973 - regression_loss: 0.1898 - classification_loss: 0.0075 344/500 [===================>..........] - ETA: 1:18 - loss: 0.1969 - regression_loss: 0.1894 - classification_loss: 0.0075 345/500 [===================>..........] 
- ETA: 1:17 - loss: 0.1967 - regression_loss: 0.1892 - classification_loss: 0.0075 346/500 [===================>..........] - ETA: 1:17 - loss: 0.1966 - regression_loss: 0.1891 - classification_loss: 0.0075 347/500 [===================>..........] - ETA: 1:16 - loss: 0.1965 - regression_loss: 0.1890 - classification_loss: 0.0075 348/500 [===================>..........] - ETA: 1:16 - loss: 0.1966 - regression_loss: 0.1891 - classification_loss: 0.0075 349/500 [===================>..........] - ETA: 1:15 - loss: 0.1965 - regression_loss: 0.1890 - classification_loss: 0.0075 350/500 [====================>.........] - ETA: 1:15 - loss: 0.1963 - regression_loss: 0.1889 - classification_loss: 0.0075 351/500 [====================>.........] - ETA: 1:14 - loss: 0.1966 - regression_loss: 0.1891 - classification_loss: 0.0075 352/500 [====================>.........] - ETA: 1:14 - loss: 0.1964 - regression_loss: 0.1889 - classification_loss: 0.0075 353/500 [====================>.........] - ETA: 1:13 - loss: 0.1963 - regression_loss: 0.1888 - classification_loss: 0.0075 354/500 [====================>.........] - ETA: 1:13 - loss: 0.1968 - regression_loss: 0.1891 - classification_loss: 0.0076 355/500 [====================>.........] - ETA: 1:12 - loss: 0.1967 - regression_loss: 0.1891 - classification_loss: 0.0076 356/500 [====================>.........] - ETA: 1:12 - loss: 0.1966 - regression_loss: 0.1890 - classification_loss: 0.0076 357/500 [====================>.........] - ETA: 1:11 - loss: 0.1964 - regression_loss: 0.1888 - classification_loss: 0.0076 358/500 [====================>.........] - ETA: 1:11 - loss: 0.1967 - regression_loss: 0.1891 - classification_loss: 0.0076 359/500 [====================>.........] - ETA: 1:10 - loss: 0.1964 - regression_loss: 0.1888 - classification_loss: 0.0076 360/500 [====================>.........] - ETA: 1:10 - loss: 0.1966 - regression_loss: 0.1890 - classification_loss: 0.0076 361/500 [====================>.........] 
- ETA: 1:09 - loss: 0.1963 - regression_loss: 0.1888 - classification_loss: 0.0076 362/500 [====================>.........] - ETA: 1:09 - loss: 0.1964 - regression_loss: 0.1889 - classification_loss: 0.0076 363/500 [====================>.........] - ETA: 1:08 - loss: 0.1965 - regression_loss: 0.1890 - classification_loss: 0.0076 364/500 [====================>.........] - ETA: 1:08 - loss: 0.1963 - regression_loss: 0.1888 - classification_loss: 0.0076 365/500 [====================>.........] - ETA: 1:07 - loss: 0.1962 - regression_loss: 0.1887 - classification_loss: 0.0076 366/500 [====================>.........] - ETA: 1:07 - loss: 0.1964 - regression_loss: 0.1888 - classification_loss: 0.0076 367/500 [=====================>........] - ETA: 1:06 - loss: 0.1963 - regression_loss: 0.1888 - classification_loss: 0.0076 368/500 [=====================>........] - ETA: 1:06 - loss: 0.1962 - regression_loss: 0.1887 - classification_loss: 0.0076 369/500 [=====================>........] - ETA: 1:05 - loss: 0.1963 - regression_loss: 0.1887 - classification_loss: 0.0076 370/500 [=====================>........] - ETA: 1:05 - loss: 0.1963 - regression_loss: 0.1888 - classification_loss: 0.0075 371/500 [=====================>........] - ETA: 1:04 - loss: 0.1969 - regression_loss: 0.1893 - classification_loss: 0.0075 372/500 [=====================>........] - ETA: 1:04 - loss: 0.1971 - regression_loss: 0.1895 - classification_loss: 0.0076 373/500 [=====================>........] - ETA: 1:03 - loss: 0.1974 - regression_loss: 0.1899 - classification_loss: 0.0076 374/500 [=====================>........] - ETA: 1:03 - loss: 0.1974 - regression_loss: 0.1898 - classification_loss: 0.0076 375/500 [=====================>........] - ETA: 1:02 - loss: 0.1974 - regression_loss: 0.1898 - classification_loss: 0.0076 376/500 [=====================>........] - ETA: 1:02 - loss: 0.1973 - regression_loss: 0.1897 - classification_loss: 0.0076 377/500 [=====================>........] 
- ETA: 1:01 - loss: 0.1977 - regression_loss: 0.1900 - classification_loss: 0.0076 378/500 [=====================>........] - ETA: 1:01 - loss: 0.1975 - regression_loss: 0.1899 - classification_loss: 0.0076 379/500 [=====================>........] - ETA: 1:00 - loss: 0.1978 - regression_loss: 0.1901 - classification_loss: 0.0076 380/500 [=====================>........] - ETA: 1:00 - loss: 0.1975 - regression_loss: 0.1899 - classification_loss: 0.0076 381/500 [=====================>........] - ETA: 59s - loss: 0.1977 - regression_loss: 0.1901 - classification_loss: 0.0076  382/500 [=====================>........] - ETA: 59s - loss: 0.1977 - regression_loss: 0.1901 - classification_loss: 0.0076 383/500 [=====================>........] - ETA: 58s - loss: 0.1978 - regression_loss: 0.1902 - classification_loss: 0.0076 384/500 [======================>.......] - ETA: 58s - loss: 0.1977 - regression_loss: 0.1901 - classification_loss: 0.0076 385/500 [======================>.......] - ETA: 57s - loss: 0.1979 - regression_loss: 0.1903 - classification_loss: 0.0076 386/500 [======================>.......] - ETA: 57s - loss: 0.1981 - regression_loss: 0.1905 - classification_loss: 0.0076 387/500 [======================>.......] - ETA: 56s - loss: 0.1982 - regression_loss: 0.1906 - classification_loss: 0.0076 388/500 [======================>.......] - ETA: 56s - loss: 0.1981 - regression_loss: 0.1905 - classification_loss: 0.0076 389/500 [======================>.......] - ETA: 55s - loss: 0.1984 - regression_loss: 0.1908 - classification_loss: 0.0076 390/500 [======================>.......] - ETA: 55s - loss: 0.1986 - regression_loss: 0.1909 - classification_loss: 0.0076 391/500 [======================>.......] - ETA: 54s - loss: 0.1984 - regression_loss: 0.1908 - classification_loss: 0.0076 392/500 [======================>.......] - ETA: 54s - loss: 0.1989 - regression_loss: 0.1912 - classification_loss: 0.0077 393/500 [======================>.......] 
- ETA: 53s - loss: 0.1986 - regression_loss: 0.1909 - classification_loss: 0.0077 394/500 [======================>.......] - ETA: 53s - loss: 0.1983 - regression_loss: 0.1906 - classification_loss: 0.0077 395/500 [======================>.......] - ETA: 52s - loss: 0.1983 - regression_loss: 0.1906 - classification_loss: 0.0077 396/500 [======================>.......] - ETA: 52s - loss: 0.1987 - regression_loss: 0.1910 - classification_loss: 0.0077 397/500 [======================>.......] - ETA: 51s - loss: 0.1987 - regression_loss: 0.1910 - classification_loss: 0.0077 398/500 [======================>.......] - ETA: 51s - loss: 0.1987 - regression_loss: 0.1911 - classification_loss: 0.0077 399/500 [======================>.......] - ETA: 50s - loss: 0.1986 - regression_loss: 0.1910 - classification_loss: 0.0076 400/500 [=======================>......] - ETA: 50s - loss: 0.1988 - regression_loss: 0.1912 - classification_loss: 0.0077 401/500 [=======================>......] - ETA: 49s - loss: 0.1988 - regression_loss: 0.1911 - classification_loss: 0.0076 402/500 [=======================>......] - ETA: 49s - loss: 0.1989 - regression_loss: 0.1912 - classification_loss: 0.0076 403/500 [=======================>......] - ETA: 48s - loss: 0.1991 - regression_loss: 0.1914 - classification_loss: 0.0077 404/500 [=======================>......] - ETA: 48s - loss: 0.1989 - regression_loss: 0.1912 - classification_loss: 0.0076 405/500 [=======================>......] - ETA: 47s - loss: 0.1991 - regression_loss: 0.1914 - classification_loss: 0.0076 406/500 [=======================>......] - ETA: 47s - loss: 0.1989 - regression_loss: 0.1913 - classification_loss: 0.0076 407/500 [=======================>......] - ETA: 46s - loss: 0.1991 - regression_loss: 0.1915 - classification_loss: 0.0076 408/500 [=======================>......] - ETA: 46s - loss: 0.1988 - regression_loss: 0.1912 - classification_loss: 0.0076 409/500 [=======================>......] 
- ETA: 45s - loss: 0.1991 - regression_loss: 0.1915 - classification_loss: 0.0076 410/500 [=======================>......] - ETA: 45s - loss: 0.1998 - regression_loss: 0.1922 - classification_loss: 0.0076 411/500 [=======================>......] - ETA: 44s - loss: 0.1998 - regression_loss: 0.1922 - classification_loss: 0.0076 412/500 [=======================>......] - ETA: 44s - loss: 0.1998 - regression_loss: 0.1922 - classification_loss: 0.0077 413/500 [=======================>......] - ETA: 43s - loss: 0.1996 - regression_loss: 0.1920 - classification_loss: 0.0076 414/500 [=======================>......] - ETA: 43s - loss: 0.1997 - regression_loss: 0.1921 - classification_loss: 0.0077 415/500 [=======================>......] - ETA: 42s - loss: 0.1997 - regression_loss: 0.1921 - classification_loss: 0.0076 416/500 [=======================>......] - ETA: 42s - loss: 0.1996 - regression_loss: 0.1920 - classification_loss: 0.0076 417/500 [========================>.....] - ETA: 41s - loss: 0.1994 - regression_loss: 0.1918 - classification_loss: 0.0076 418/500 [========================>.....] - ETA: 41s - loss: 0.1994 - regression_loss: 0.1918 - classification_loss: 0.0076 419/500 [========================>.....] - ETA: 40s - loss: 0.1995 - regression_loss: 0.1919 - classification_loss: 0.0076 420/500 [========================>.....] - ETA: 40s - loss: 0.1993 - regression_loss: 0.1917 - classification_loss: 0.0076 421/500 [========================>.....] - ETA: 39s - loss: 0.1997 - regression_loss: 0.1920 - classification_loss: 0.0077 422/500 [========================>.....] - ETA: 39s - loss: 0.1995 - regression_loss: 0.1918 - classification_loss: 0.0077 423/500 [========================>.....] - ETA: 38s - loss: 0.1993 - regression_loss: 0.1917 - classification_loss: 0.0077 424/500 [========================>.....] - ETA: 38s - loss: 0.1993 - regression_loss: 0.1916 - classification_loss: 0.0077 425/500 [========================>.....] 
- ETA: 37s - loss: 0.1991 - regression_loss: 0.1914 - classification_loss: 0.0077 426/500 [========================>.....] - ETA: 37s - loss: 0.1994 - regression_loss: 0.1917 - classification_loss: 0.0077 427/500 [========================>.....] - ETA: 36s - loss: 0.1993 - regression_loss: 0.1916 - classification_loss: 0.0077 428/500 [========================>.....] - ETA: 36s - loss: 0.1991 - regression_loss: 0.1915 - classification_loss: 0.0077 429/500 [========================>.....] - ETA: 35s - loss: 0.1992 - regression_loss: 0.1915 - classification_loss: 0.0077 430/500 [========================>.....] - ETA: 35s - loss: 0.1993 - regression_loss: 0.1916 - classification_loss: 0.0077 431/500 [========================>.....] - ETA: 34s - loss: 0.1990 - regression_loss: 0.1914 - classification_loss: 0.0077 432/500 [========================>.....] - ETA: 34s - loss: 0.1991 - regression_loss: 0.1914 - classification_loss: 0.0077 433/500 [========================>.....] - ETA: 33s - loss: 0.1990 - regression_loss: 0.1914 - classification_loss: 0.0077 434/500 [=========================>....] - ETA: 33s - loss: 0.1993 - regression_loss: 0.1916 - classification_loss: 0.0077 435/500 [=========================>....] - ETA: 32s - loss: 0.1995 - regression_loss: 0.1918 - classification_loss: 0.0077 436/500 [=========================>....] - ETA: 32s - loss: 0.1995 - regression_loss: 0.1918 - classification_loss: 0.0077 437/500 [=========================>....] - ETA: 31s - loss: 0.1993 - regression_loss: 0.1916 - classification_loss: 0.0077 438/500 [=========================>....] - ETA: 31s - loss: 0.1991 - regression_loss: 0.1914 - classification_loss: 0.0077 439/500 [=========================>....] - ETA: 30s - loss: 0.1994 - regression_loss: 0.1917 - classification_loss: 0.0077 440/500 [=========================>....] - ETA: 30s - loss: 0.1995 - regression_loss: 0.1918 - classification_loss: 0.0077 441/500 [=========================>....] 
[Epoch 38, steps 442-499: per-batch progress redraws elided; loss held near 0.200]
500/500 [==============================] - 251s 502ms/step - loss: 0.2036 - regression_loss: 0.1959 - classification_loss: 0.0077
1172 instances of class plum with average precision: 0.7479
mAP: 0.7479
Epoch 00038: saving model to ./training/snapshots/resnet101_pascal_38.h5
Epoch 39/150
[Epoch 39, steps 1-274: per-batch progress redraws elided; loss settled around 0.226]
275/500 [===============>..............] - ETA: 1:53 - loss: 0.2259 - regression_loss: 0.2174 - classification_loss: 0.0085
- ETA: 1:52 - loss: 0.2259 - regression_loss: 0.2174 - classification_loss: 0.0085 277/500 [===============>..............] - ETA: 1:52 - loss: 0.2258 - regression_loss: 0.2174 - classification_loss: 0.0084 278/500 [===============>..............] - ETA: 1:51 - loss: 0.2259 - regression_loss: 0.2175 - classification_loss: 0.0084 279/500 [===============>..............] - ETA: 1:51 - loss: 0.2258 - regression_loss: 0.2173 - classification_loss: 0.0084 280/500 [===============>..............] - ETA: 1:50 - loss: 0.2255 - regression_loss: 0.2171 - classification_loss: 0.0084 281/500 [===============>..............] - ETA: 1:50 - loss: 0.2258 - regression_loss: 0.2174 - classification_loss: 0.0084 282/500 [===============>..............] - ETA: 1:49 - loss: 0.2255 - regression_loss: 0.2171 - classification_loss: 0.0084 283/500 [===============>..............] - ETA: 1:49 - loss: 0.2251 - regression_loss: 0.2168 - classification_loss: 0.0084 284/500 [================>.............] - ETA: 1:48 - loss: 0.2252 - regression_loss: 0.2168 - classification_loss: 0.0084 285/500 [================>.............] - ETA: 1:48 - loss: 0.2251 - regression_loss: 0.2167 - classification_loss: 0.0084 286/500 [================>.............] - ETA: 1:47 - loss: 0.2251 - regression_loss: 0.2168 - classification_loss: 0.0084 287/500 [================>.............] - ETA: 1:47 - loss: 0.2249 - regression_loss: 0.2165 - classification_loss: 0.0084 288/500 [================>.............] - ETA: 1:46 - loss: 0.2255 - regression_loss: 0.2171 - classification_loss: 0.0084 289/500 [================>.............] - ETA: 1:46 - loss: 0.2255 - regression_loss: 0.2170 - classification_loss: 0.0084 290/500 [================>.............] - ETA: 1:45 - loss: 0.2252 - regression_loss: 0.2168 - classification_loss: 0.0084 291/500 [================>.............] - ETA: 1:45 - loss: 0.2248 - regression_loss: 0.2164 - classification_loss: 0.0084 292/500 [================>.............] 
- ETA: 1:44 - loss: 0.2249 - regression_loss: 0.2165 - classification_loss: 0.0084 293/500 [================>.............] - ETA: 1:44 - loss: 0.2251 - regression_loss: 0.2167 - classification_loss: 0.0084 294/500 [================>.............] - ETA: 1:43 - loss: 0.2249 - regression_loss: 0.2165 - classification_loss: 0.0084 295/500 [================>.............] - ETA: 1:43 - loss: 0.2257 - regression_loss: 0.2173 - classification_loss: 0.0084 296/500 [================>.............] - ETA: 1:42 - loss: 0.2253 - regression_loss: 0.2169 - classification_loss: 0.0084 297/500 [================>.............] - ETA: 1:42 - loss: 0.2249 - regression_loss: 0.2166 - classification_loss: 0.0084 298/500 [================>.............] - ETA: 1:41 - loss: 0.2244 - regression_loss: 0.2161 - classification_loss: 0.0084 299/500 [================>.............] - ETA: 1:41 - loss: 0.2243 - regression_loss: 0.2160 - classification_loss: 0.0084 300/500 [=================>............] - ETA: 1:40 - loss: 0.2245 - regression_loss: 0.2161 - classification_loss: 0.0084 301/500 [=================>............] - ETA: 1:40 - loss: 0.2246 - regression_loss: 0.2162 - classification_loss: 0.0084 302/500 [=================>............] - ETA: 1:39 - loss: 0.2247 - regression_loss: 0.2163 - classification_loss: 0.0084 303/500 [=================>............] - ETA: 1:39 - loss: 0.2248 - regression_loss: 0.2165 - classification_loss: 0.0084 304/500 [=================>............] - ETA: 1:38 - loss: 0.2249 - regression_loss: 0.2165 - classification_loss: 0.0084 305/500 [=================>............] - ETA: 1:38 - loss: 0.2250 - regression_loss: 0.2166 - classification_loss: 0.0084 306/500 [=================>............] - ETA: 1:37 - loss: 0.2255 - regression_loss: 0.2171 - classification_loss: 0.0084 307/500 [=================>............] - ETA: 1:37 - loss: 0.2254 - regression_loss: 0.2170 - classification_loss: 0.0084 308/500 [=================>............] 
- ETA: 1:36 - loss: 0.2256 - regression_loss: 0.2172 - classification_loss: 0.0084 309/500 [=================>............] - ETA: 1:36 - loss: 0.2255 - regression_loss: 0.2171 - classification_loss: 0.0084 310/500 [=================>............] - ETA: 1:35 - loss: 0.2256 - regression_loss: 0.2172 - classification_loss: 0.0084 311/500 [=================>............] - ETA: 1:35 - loss: 0.2259 - regression_loss: 0.2175 - classification_loss: 0.0084 312/500 [=================>............] - ETA: 1:34 - loss: 0.2262 - regression_loss: 0.2177 - classification_loss: 0.0084 313/500 [=================>............] - ETA: 1:34 - loss: 0.2259 - regression_loss: 0.2175 - classification_loss: 0.0084 314/500 [=================>............] - ETA: 1:33 - loss: 0.2255 - regression_loss: 0.2171 - classification_loss: 0.0084 315/500 [=================>............] - ETA: 1:33 - loss: 0.2254 - regression_loss: 0.2171 - classification_loss: 0.0084 316/500 [=================>............] - ETA: 1:32 - loss: 0.2257 - regression_loss: 0.2172 - classification_loss: 0.0084 317/500 [==================>...........] - ETA: 1:32 - loss: 0.2257 - regression_loss: 0.2173 - classification_loss: 0.0084 318/500 [==================>...........] - ETA: 1:31 - loss: 0.2254 - regression_loss: 0.2170 - classification_loss: 0.0084 319/500 [==================>...........] - ETA: 1:31 - loss: 0.2258 - regression_loss: 0.2174 - classification_loss: 0.0084 320/500 [==================>...........] - ETA: 1:30 - loss: 0.2258 - regression_loss: 0.2174 - classification_loss: 0.0084 321/500 [==================>...........] - ETA: 1:30 - loss: 0.2258 - regression_loss: 0.2174 - classification_loss: 0.0085 322/500 [==================>...........] - ETA: 1:29 - loss: 0.2255 - regression_loss: 0.2171 - classification_loss: 0.0085 323/500 [==================>...........] - ETA: 1:29 - loss: 0.2251 - regression_loss: 0.2167 - classification_loss: 0.0084 324/500 [==================>...........] 
- ETA: 1:28 - loss: 0.2246 - regression_loss: 0.2162 - classification_loss: 0.0084 325/500 [==================>...........] - ETA: 1:28 - loss: 0.2243 - regression_loss: 0.2159 - classification_loss: 0.0084 326/500 [==================>...........] - ETA: 1:27 - loss: 0.2240 - regression_loss: 0.2156 - classification_loss: 0.0084 327/500 [==================>...........] - ETA: 1:27 - loss: 0.2240 - regression_loss: 0.2156 - classification_loss: 0.0084 328/500 [==================>...........] - ETA: 1:26 - loss: 0.2238 - regression_loss: 0.2155 - classification_loss: 0.0084 329/500 [==================>...........] - ETA: 1:26 - loss: 0.2239 - regression_loss: 0.2155 - classification_loss: 0.0083 330/500 [==================>...........] - ETA: 1:25 - loss: 0.2237 - regression_loss: 0.2154 - classification_loss: 0.0084 331/500 [==================>...........] - ETA: 1:25 - loss: 0.2239 - regression_loss: 0.2156 - classification_loss: 0.0084 332/500 [==================>...........] - ETA: 1:24 - loss: 0.2237 - regression_loss: 0.2153 - classification_loss: 0.0084 333/500 [==================>...........] - ETA: 1:24 - loss: 0.2234 - regression_loss: 0.2150 - classification_loss: 0.0083 334/500 [===================>..........] - ETA: 1:23 - loss: 0.2230 - regression_loss: 0.2147 - classification_loss: 0.0083 335/500 [===================>..........] - ETA: 1:23 - loss: 0.2230 - regression_loss: 0.2146 - classification_loss: 0.0084 336/500 [===================>..........] - ETA: 1:22 - loss: 0.2229 - regression_loss: 0.2146 - classification_loss: 0.0084 337/500 [===================>..........] - ETA: 1:22 - loss: 0.2227 - regression_loss: 0.2143 - classification_loss: 0.0083 338/500 [===================>..........] - ETA: 1:21 - loss: 0.2226 - regression_loss: 0.2142 - classification_loss: 0.0083 339/500 [===================>..........] - ETA: 1:20 - loss: 0.2224 - regression_loss: 0.2141 - classification_loss: 0.0084 340/500 [===================>..........] 
- ETA: 1:20 - loss: 0.2222 - regression_loss: 0.2139 - classification_loss: 0.0083 341/500 [===================>..........] - ETA: 1:19 - loss: 0.2221 - regression_loss: 0.2138 - classification_loss: 0.0083 342/500 [===================>..........] - ETA: 1:19 - loss: 0.2222 - regression_loss: 0.2139 - classification_loss: 0.0083 343/500 [===================>..........] - ETA: 1:18 - loss: 0.2221 - regression_loss: 0.2137 - classification_loss: 0.0083 344/500 [===================>..........] - ETA: 1:18 - loss: 0.2216 - regression_loss: 0.2133 - classification_loss: 0.0083 345/500 [===================>..........] - ETA: 1:17 - loss: 0.2218 - regression_loss: 0.2135 - classification_loss: 0.0083 346/500 [===================>..........] - ETA: 1:17 - loss: 0.2220 - regression_loss: 0.2136 - classification_loss: 0.0083 347/500 [===================>..........] - ETA: 1:16 - loss: 0.2220 - regression_loss: 0.2137 - classification_loss: 0.0083 348/500 [===================>..........] - ETA: 1:16 - loss: 0.2220 - regression_loss: 0.2137 - classification_loss: 0.0083 349/500 [===================>..........] - ETA: 1:15 - loss: 0.2222 - regression_loss: 0.2139 - classification_loss: 0.0083 350/500 [====================>.........] - ETA: 1:15 - loss: 0.2218 - regression_loss: 0.2135 - classification_loss: 0.0083 351/500 [====================>.........] - ETA: 1:14 - loss: 0.2217 - regression_loss: 0.2134 - classification_loss: 0.0083 352/500 [====================>.........] - ETA: 1:14 - loss: 0.2220 - regression_loss: 0.2136 - classification_loss: 0.0083 353/500 [====================>.........] - ETA: 1:13 - loss: 0.2219 - regression_loss: 0.2136 - classification_loss: 0.0083 354/500 [====================>.........] - ETA: 1:13 - loss: 0.2219 - regression_loss: 0.2136 - classification_loss: 0.0083 355/500 [====================>.........] - ETA: 1:12 - loss: 0.2216 - regression_loss: 0.2133 - classification_loss: 0.0083 356/500 [====================>.........] 
- ETA: 1:12 - loss: 0.2216 - regression_loss: 0.2133 - classification_loss: 0.0083 357/500 [====================>.........] - ETA: 1:11 - loss: 0.2212 - regression_loss: 0.2129 - classification_loss: 0.0083 358/500 [====================>.........] - ETA: 1:11 - loss: 0.2216 - regression_loss: 0.2133 - classification_loss: 0.0083 359/500 [====================>.........] - ETA: 1:10 - loss: 0.2217 - regression_loss: 0.2134 - classification_loss: 0.0083 360/500 [====================>.........] - ETA: 1:10 - loss: 0.2215 - regression_loss: 0.2132 - classification_loss: 0.0083 361/500 [====================>.........] - ETA: 1:09 - loss: 0.2222 - regression_loss: 0.2138 - classification_loss: 0.0083 362/500 [====================>.........] - ETA: 1:09 - loss: 0.2224 - regression_loss: 0.2140 - classification_loss: 0.0084 363/500 [====================>.........] - ETA: 1:08 - loss: 0.2229 - regression_loss: 0.2145 - classification_loss: 0.0084 364/500 [====================>.........] - ETA: 1:08 - loss: 0.2225 - regression_loss: 0.2141 - classification_loss: 0.0084 365/500 [====================>.........] - ETA: 1:07 - loss: 0.2230 - regression_loss: 0.2146 - classification_loss: 0.0084 366/500 [====================>.........] - ETA: 1:07 - loss: 0.2228 - regression_loss: 0.2145 - classification_loss: 0.0083 367/500 [=====================>........] - ETA: 1:06 - loss: 0.2232 - regression_loss: 0.2147 - classification_loss: 0.0085 368/500 [=====================>........] - ETA: 1:06 - loss: 0.2230 - regression_loss: 0.2145 - classification_loss: 0.0085 369/500 [=====================>........] - ETA: 1:05 - loss: 0.2231 - regression_loss: 0.2146 - classification_loss: 0.0085 370/500 [=====================>........] - ETA: 1:05 - loss: 0.2230 - regression_loss: 0.2145 - classification_loss: 0.0085 371/500 [=====================>........] - ETA: 1:04 - loss: 0.2229 - regression_loss: 0.2143 - classification_loss: 0.0085 372/500 [=====================>........] 
- ETA: 1:04 - loss: 0.2226 - regression_loss: 0.2141 - classification_loss: 0.0085 373/500 [=====================>........] - ETA: 1:03 - loss: 0.2225 - regression_loss: 0.2140 - classification_loss: 0.0085 374/500 [=====================>........] - ETA: 1:03 - loss: 0.2222 - regression_loss: 0.2137 - classification_loss: 0.0085 375/500 [=====================>........] - ETA: 1:02 - loss: 0.2220 - regression_loss: 0.2135 - classification_loss: 0.0085 376/500 [=====================>........] - ETA: 1:02 - loss: 0.2217 - regression_loss: 0.2132 - classification_loss: 0.0085 377/500 [=====================>........] - ETA: 1:01 - loss: 0.2215 - regression_loss: 0.2130 - classification_loss: 0.0085 378/500 [=====================>........] - ETA: 1:01 - loss: 0.2214 - regression_loss: 0.2130 - classification_loss: 0.0084 379/500 [=====================>........] - ETA: 1:00 - loss: 0.2217 - regression_loss: 0.2132 - classification_loss: 0.0085 380/500 [=====================>........] - ETA: 1:00 - loss: 0.2216 - regression_loss: 0.2131 - classification_loss: 0.0085 381/500 [=====================>........] - ETA: 59s - loss: 0.2219 - regression_loss: 0.2134 - classification_loss: 0.0085  382/500 [=====================>........] - ETA: 59s - loss: 0.2220 - regression_loss: 0.2135 - classification_loss: 0.0085 383/500 [=====================>........] - ETA: 58s - loss: 0.2217 - regression_loss: 0.2132 - classification_loss: 0.0085 384/500 [======================>.......] - ETA: 58s - loss: 0.2217 - regression_loss: 0.2132 - classification_loss: 0.0085 385/500 [======================>.......] - ETA: 57s - loss: 0.2216 - regression_loss: 0.2131 - classification_loss: 0.0085 386/500 [======================>.......] - ETA: 57s - loss: 0.2216 - regression_loss: 0.2131 - classification_loss: 0.0085 387/500 [======================>.......] - ETA: 56s - loss: 0.2215 - regression_loss: 0.2130 - classification_loss: 0.0085 388/500 [======================>.......] 
- ETA: 56s - loss: 0.2215 - regression_loss: 0.2130 - classification_loss: 0.0085 389/500 [======================>.......] - ETA: 55s - loss: 0.2214 - regression_loss: 0.2129 - classification_loss: 0.0085 390/500 [======================>.......] - ETA: 55s - loss: 0.2215 - regression_loss: 0.2130 - classification_loss: 0.0085 391/500 [======================>.......] - ETA: 54s - loss: 0.2215 - regression_loss: 0.2130 - classification_loss: 0.0085 392/500 [======================>.......] - ETA: 54s - loss: 0.2215 - regression_loss: 0.2131 - classification_loss: 0.0085 393/500 [======================>.......] - ETA: 53s - loss: 0.2214 - regression_loss: 0.2130 - classification_loss: 0.0084 394/500 [======================>.......] - ETA: 53s - loss: 0.2213 - regression_loss: 0.2128 - classification_loss: 0.0084 395/500 [======================>.......] - ETA: 52s - loss: 0.2215 - regression_loss: 0.2131 - classification_loss: 0.0084 396/500 [======================>.......] - ETA: 52s - loss: 0.2218 - regression_loss: 0.2134 - classification_loss: 0.0085 397/500 [======================>.......] - ETA: 51s - loss: 0.2218 - regression_loss: 0.2133 - classification_loss: 0.0084 398/500 [======================>.......] - ETA: 51s - loss: 0.2215 - regression_loss: 0.2131 - classification_loss: 0.0084 399/500 [======================>.......] - ETA: 50s - loss: 0.2212 - regression_loss: 0.2128 - classification_loss: 0.0084 400/500 [=======================>......] - ETA: 50s - loss: 0.2212 - regression_loss: 0.2128 - classification_loss: 0.0084 401/500 [=======================>......] - ETA: 49s - loss: 0.2213 - regression_loss: 0.2129 - classification_loss: 0.0084 402/500 [=======================>......] - ETA: 49s - loss: 0.2212 - regression_loss: 0.2127 - classification_loss: 0.0084 403/500 [=======================>......] - ETA: 48s - loss: 0.2214 - regression_loss: 0.2129 - classification_loss: 0.0085 404/500 [=======================>......] 
- ETA: 48s - loss: 0.2211 - regression_loss: 0.2126 - classification_loss: 0.0085 405/500 [=======================>......] - ETA: 47s - loss: 0.2209 - regression_loss: 0.2125 - classification_loss: 0.0085 406/500 [=======================>......] - ETA: 47s - loss: 0.2207 - regression_loss: 0.2122 - classification_loss: 0.0084 407/500 [=======================>......] - ETA: 46s - loss: 0.2203 - regression_loss: 0.2119 - classification_loss: 0.0084 408/500 [=======================>......] - ETA: 46s - loss: 0.2201 - regression_loss: 0.2116 - classification_loss: 0.0084 409/500 [=======================>......] - ETA: 45s - loss: 0.2197 - regression_loss: 0.2113 - classification_loss: 0.0084 410/500 [=======================>......] - ETA: 45s - loss: 0.2194 - regression_loss: 0.2110 - classification_loss: 0.0084 411/500 [=======================>......] - ETA: 44s - loss: 0.2195 - regression_loss: 0.2111 - classification_loss: 0.0084 412/500 [=======================>......] - ETA: 44s - loss: 0.2196 - regression_loss: 0.2112 - classification_loss: 0.0084 413/500 [=======================>......] - ETA: 43s - loss: 0.2196 - regression_loss: 0.2112 - classification_loss: 0.0084 414/500 [=======================>......] - ETA: 43s - loss: 0.2195 - regression_loss: 0.2111 - classification_loss: 0.0084 415/500 [=======================>......] - ETA: 42s - loss: 0.2194 - regression_loss: 0.2110 - classification_loss: 0.0084 416/500 [=======================>......] - ETA: 42s - loss: 0.2192 - regression_loss: 0.2108 - classification_loss: 0.0084 417/500 [========================>.....] - ETA: 41s - loss: 0.2189 - regression_loss: 0.2105 - classification_loss: 0.0084 418/500 [========================>.....] - ETA: 41s - loss: 0.2188 - regression_loss: 0.2105 - classification_loss: 0.0083 419/500 [========================>.....] - ETA: 40s - loss: 0.2187 - regression_loss: 0.2103 - classification_loss: 0.0083 420/500 [========================>.....] 
- ETA: 40s - loss: 0.2190 - regression_loss: 0.2106 - classification_loss: 0.0083 421/500 [========================>.....] - ETA: 39s - loss: 0.2191 - regression_loss: 0.2107 - classification_loss: 0.0083 422/500 [========================>.....] - ETA: 39s - loss: 0.2194 - regression_loss: 0.2111 - classification_loss: 0.0083 423/500 [========================>.....] - ETA: 38s - loss: 0.2195 - regression_loss: 0.2111 - classification_loss: 0.0083 424/500 [========================>.....] - ETA: 38s - loss: 0.2193 - regression_loss: 0.2110 - classification_loss: 0.0083 425/500 [========================>.....] - ETA: 37s - loss: 0.2193 - regression_loss: 0.2109 - classification_loss: 0.0083 426/500 [========================>.....] - ETA: 37s - loss: 0.2192 - regression_loss: 0.2109 - classification_loss: 0.0083 427/500 [========================>.....] - ETA: 36s - loss: 0.2194 - regression_loss: 0.2111 - classification_loss: 0.0083 428/500 [========================>.....] - ETA: 36s - loss: 0.2192 - regression_loss: 0.2109 - classification_loss: 0.0083 429/500 [========================>.....] - ETA: 35s - loss: 0.2190 - regression_loss: 0.2107 - classification_loss: 0.0083 430/500 [========================>.....] - ETA: 35s - loss: 0.2189 - regression_loss: 0.2106 - classification_loss: 0.0083 431/500 [========================>.....] - ETA: 34s - loss: 0.2186 - regression_loss: 0.2103 - classification_loss: 0.0083 432/500 [========================>.....] - ETA: 34s - loss: 0.2183 - regression_loss: 0.2100 - classification_loss: 0.0083 433/500 [========================>.....] - ETA: 33s - loss: 0.2183 - regression_loss: 0.2101 - classification_loss: 0.0083 434/500 [=========================>....] - ETA: 33s - loss: 0.2182 - regression_loss: 0.2100 - classification_loss: 0.0082 435/500 [=========================>....] - ETA: 32s - loss: 0.2181 - regression_loss: 0.2098 - classification_loss: 0.0082 436/500 [=========================>....] 
- ETA: 32s - loss: 0.2179 - regression_loss: 0.2097 - classification_loss: 0.0082 437/500 [=========================>....] - ETA: 31s - loss: 0.2176 - regression_loss: 0.2094 - classification_loss: 0.0082 438/500 [=========================>....] - ETA: 31s - loss: 0.2175 - regression_loss: 0.2093 - classification_loss: 0.0082 439/500 [=========================>....] - ETA: 30s - loss: 0.2174 - regression_loss: 0.2092 - classification_loss: 0.0082 440/500 [=========================>....] - ETA: 30s - loss: 0.2175 - regression_loss: 0.2093 - classification_loss: 0.0082 441/500 [=========================>....] - ETA: 29s - loss: 0.2175 - regression_loss: 0.2093 - classification_loss: 0.0082 442/500 [=========================>....] - ETA: 29s - loss: 0.2172 - regression_loss: 0.2091 - classification_loss: 0.0082 443/500 [=========================>....] - ETA: 28s - loss: 0.2170 - regression_loss: 0.2089 - classification_loss: 0.0082 444/500 [=========================>....] - ETA: 28s - loss: 0.2172 - regression_loss: 0.2090 - classification_loss: 0.0082 445/500 [=========================>....] - ETA: 27s - loss: 0.2172 - regression_loss: 0.2090 - classification_loss: 0.0082 446/500 [=========================>....] - ETA: 27s - loss: 0.2171 - regression_loss: 0.2089 - classification_loss: 0.0082 447/500 [=========================>....] - ETA: 26s - loss: 0.2169 - regression_loss: 0.2087 - classification_loss: 0.0082 448/500 [=========================>....] - ETA: 26s - loss: 0.2174 - regression_loss: 0.2091 - classification_loss: 0.0082 449/500 [=========================>....] - ETA: 25s - loss: 0.2179 - regression_loss: 0.2096 - classification_loss: 0.0083 450/500 [==========================>...] - ETA: 25s - loss: 0.2179 - regression_loss: 0.2096 - classification_loss: 0.0083 451/500 [==========================>...] - ETA: 24s - loss: 0.2176 - regression_loss: 0.2094 - classification_loss: 0.0083 452/500 [==========================>...] 
- ETA: 24s - loss: 0.2176 - regression_loss: 0.2094 - classification_loss: 0.0083 453/500 [==========================>...] - ETA: 23s - loss: 0.2175 - regression_loss: 0.2092 - classification_loss: 0.0083 454/500 [==========================>...] - ETA: 23s - loss: 0.2173 - regression_loss: 0.2090 - classification_loss: 0.0082 455/500 [==========================>...] - ETA: 22s - loss: 0.2173 - regression_loss: 0.2091 - classification_loss: 0.0083 456/500 [==========================>...] - ETA: 22s - loss: 0.2171 - regression_loss: 0.2089 - classification_loss: 0.0082 457/500 [==========================>...] - ETA: 21s - loss: 0.2168 - regression_loss: 0.2086 - classification_loss: 0.0082 458/500 [==========================>...] - ETA: 21s - loss: 0.2166 - regression_loss: 0.2084 - classification_loss: 0.0082 459/500 [==========================>...] - ETA: 20s - loss: 0.2165 - regression_loss: 0.2083 - classification_loss: 0.0082 460/500 [==========================>...] - ETA: 20s - loss: 0.2164 - regression_loss: 0.2082 - classification_loss: 0.0082 461/500 [==========================>...] - ETA: 19s - loss: 0.2166 - regression_loss: 0.2083 - classification_loss: 0.0083 462/500 [==========================>...] - ETA: 19s - loss: 0.2163 - regression_loss: 0.2081 - classification_loss: 0.0082 463/500 [==========================>...] - ETA: 18s - loss: 0.2165 - regression_loss: 0.2082 - classification_loss: 0.0082 464/500 [==========================>...] - ETA: 18s - loss: 0.2166 - regression_loss: 0.2084 - classification_loss: 0.0082 465/500 [==========================>...] - ETA: 17s - loss: 0.2163 - regression_loss: 0.2081 - classification_loss: 0.0082 466/500 [==========================>...] - ETA: 17s - loss: 0.2161 - regression_loss: 0.2079 - classification_loss: 0.0082 467/500 [===========================>..] - ETA: 16s - loss: 0.2162 - regression_loss: 0.2080 - classification_loss: 0.0082 468/500 [===========================>..] 
- ETA: 16s - loss: 0.2165 - regression_loss: 0.2082 - classification_loss: 0.0082 469/500 [===========================>..] - ETA: 15s - loss: 0.2163 - regression_loss: 0.2081 - classification_loss: 0.0082 470/500 [===========================>..] - ETA: 15s - loss: 0.2163 - regression_loss: 0.2081 - classification_loss: 0.0082 471/500 [===========================>..] - ETA: 14s - loss: 0.2162 - regression_loss: 0.2080 - classification_loss: 0.0082 472/500 [===========================>..] - ETA: 14s - loss: 0.2163 - regression_loss: 0.2081 - classification_loss: 0.0082 473/500 [===========================>..] - ETA: 13s - loss: 0.2160 - regression_loss: 0.2078 - classification_loss: 0.0082 474/500 [===========================>..] - ETA: 13s - loss: 0.2160 - regression_loss: 0.2078 - classification_loss: 0.0082 475/500 [===========================>..] - ETA: 12s - loss: 0.2161 - regression_loss: 0.2079 - classification_loss: 0.0082 476/500 [===========================>..] - ETA: 12s - loss: 0.2159 - regression_loss: 0.2078 - classification_loss: 0.0082 477/500 [===========================>..] - ETA: 11s - loss: 0.2159 - regression_loss: 0.2078 - classification_loss: 0.0082 478/500 [===========================>..] - ETA: 11s - loss: 0.2157 - regression_loss: 0.2075 - classification_loss: 0.0082 479/500 [===========================>..] - ETA: 10s - loss: 0.2154 - regression_loss: 0.2072 - classification_loss: 0.0081 480/500 [===========================>..] - ETA: 10s - loss: 0.2154 - regression_loss: 0.2072 - classification_loss: 0.0081 481/500 [===========================>..] - ETA: 9s - loss: 0.2152 - regression_loss: 0.2071 - classification_loss: 0.0081  482/500 [===========================>..] - ETA: 9s - loss: 0.2151 - regression_loss: 0.2069 - classification_loss: 0.0081 483/500 [===========================>..] - ETA: 8s - loss: 0.2149 - regression_loss: 0.2068 - classification_loss: 0.0081 484/500 [============================>.] 
- ETA: 8s - loss: 0.2150 - regression_loss: 0.2068 - classification_loss: 0.0081 485/500 [============================>.] - ETA: 7s - loss: 0.2148 - regression_loss: 0.2067 - classification_loss: 0.0081 486/500 [============================>.] - ETA: 7s - loss: 0.2150 - regression_loss: 0.2069 - classification_loss: 0.0081 487/500 [============================>.] - ETA: 6s - loss: 0.2148 - regression_loss: 0.2067 - classification_loss: 0.0081 488/500 [============================>.] - ETA: 6s - loss: 0.2147 - regression_loss: 0.2066 - classification_loss: 0.0081 489/500 [============================>.] - ETA: 5s - loss: 0.2146 - regression_loss: 0.2064 - classification_loss: 0.0082 490/500 [============================>.] - ETA: 5s - loss: 0.2146 - regression_loss: 0.2065 - classification_loss: 0.0081 491/500 [============================>.] - ETA: 4s - loss: 0.2146 - regression_loss: 0.2064 - classification_loss: 0.0081 492/500 [============================>.] - ETA: 4s - loss: 0.2147 - regression_loss: 0.2065 - classification_loss: 0.0082 493/500 [============================>.] - ETA: 3s - loss: 0.2145 - regression_loss: 0.2064 - classification_loss: 0.0081 494/500 [============================>.] - ETA: 3s - loss: 0.2147 - regression_loss: 0.2066 - classification_loss: 0.0081 495/500 [============================>.] - ETA: 2s - loss: 0.2147 - regression_loss: 0.2065 - classification_loss: 0.0081 496/500 [============================>.] - ETA: 2s - loss: 0.2148 - regression_loss: 0.2067 - classification_loss: 0.0081 497/500 [============================>.] - ETA: 1s - loss: 0.2146 - regression_loss: 0.2065 - classification_loss: 0.0081 498/500 [============================>.] - ETA: 1s - loss: 0.2146 - regression_loss: 0.2064 - classification_loss: 0.0081 499/500 [============================>.] 
- ETA: 0s - loss: 0.2146 - regression_loss: 0.2065 - classification_loss: 0.0081 500/500 [==============================] - 251s 502ms/step - loss: 0.2151 - regression_loss: 0.2070 - classification_loss: 0.0081
1172 instances of class plum with average precision: 0.7379
mAP: 0.7379
Epoch 00039: saving model to ./training/snapshots/resnet101_pascal_39.h5
Epoch 40/150
1/500 [..............................] - ETA: 3:59 - loss: 0.5524 - regression_loss: 0.5377 - classification_loss: 0.0146 2/500 [..............................] - ETA: 4:03 - loss: 0.3944 - regression_loss: 0.3859 - classification_loss: 0.0085 3/500 [..............................] - ETA: 4:04 - loss: 0.3108 - regression_loss: 0.3036 - classification_loss: 0.0072 4/500 [..............................] - ETA: 4:06 - loss: 0.3056 - regression_loss: 0.2952 - classification_loss: 0.0104 5/500 [..............................] - ETA: 4:06 - loss: 0.2986 - regression_loss: 0.2889 - classification_loss: 0.0097 6/500 [..............................] - ETA: 4:06 - loss: 0.2566 - regression_loss: 0.2485 - classification_loss: 0.0081 7/500 [..............................] - ETA: 4:07 - loss: 0.2620 - regression_loss: 0.2539 - classification_loss: 0.0082 8/500 [..............................] - ETA: 4:06 - loss: 0.2614 - regression_loss: 0.2538 - classification_loss: 0.0076 9/500 [..............................] - ETA: 4:04 - loss: 0.2480 - regression_loss: 0.2409 - classification_loss: 0.0071 10/500 [..............................] - ETA: 4:04 - loss: 0.2523 - regression_loss: 0.2451 - classification_loss: 0.0072 11/500 [..............................] - ETA: 4:04 - loss: 0.2354 - regression_loss: 0.2288 - classification_loss: 0.0066 12/500 [..............................] - ETA: 4:04 - loss: 0.2412 - regression_loss: 0.2341 - classification_loss: 0.0071 13/500 [..............................] - ETA: 4:03 - loss: 0.2582 - regression_loss: 0.2499 - classification_loss: 0.0083 14/500 [..............................] 
- ETA: 4:02 - loss: 0.2573 - regression_loss: 0.2494 - classification_loss: 0.0079 15/500 [..............................] - ETA: 4:02 - loss: 0.2500 - regression_loss: 0.2425 - classification_loss: 0.0076 16/500 [..............................] - ETA: 4:01 - loss: 0.2550 - regression_loss: 0.2473 - classification_loss: 0.0077 17/500 [>.............................] - ETA: 4:01 - loss: 0.2582 - regression_loss: 0.2500 - classification_loss: 0.0081 18/500 [>.............................] - ETA: 4:01 - loss: 0.2687 - regression_loss: 0.2603 - classification_loss: 0.0083 19/500 [>.............................] - ETA: 4:00 - loss: 0.2682 - regression_loss: 0.2600 - classification_loss: 0.0082 20/500 [>.............................] - ETA: 4:00 - loss: 0.2647 - regression_loss: 0.2566 - classification_loss: 0.0081 21/500 [>.............................] - ETA: 3:59 - loss: 0.2594 - regression_loss: 0.2513 - classification_loss: 0.0080 22/500 [>.............................] - ETA: 3:59 - loss: 0.2601 - regression_loss: 0.2522 - classification_loss: 0.0079 23/500 [>.............................] - ETA: 3:58 - loss: 0.2593 - regression_loss: 0.2512 - classification_loss: 0.0081 24/500 [>.............................] - ETA: 3:58 - loss: 0.2576 - regression_loss: 0.2492 - classification_loss: 0.0084 25/500 [>.............................] - ETA: 3:58 - loss: 0.2547 - regression_loss: 0.2462 - classification_loss: 0.0084 26/500 [>.............................] - ETA: 3:57 - loss: 0.2498 - regression_loss: 0.2416 - classification_loss: 0.0082 27/500 [>.............................] - ETA: 3:57 - loss: 0.2453 - regression_loss: 0.2373 - classification_loss: 0.0080 28/500 [>.............................] - ETA: 3:56 - loss: 0.2421 - regression_loss: 0.2343 - classification_loss: 0.0078 29/500 [>.............................] - ETA: 3:56 - loss: 0.2393 - regression_loss: 0.2315 - classification_loss: 0.0078 30/500 [>.............................] 
- ETA: 3:55 - loss: 0.2367 - regression_loss: 0.2289 - classification_loss: 0.0078 31/500 [>.............................] - ETA: 3:55 - loss: 0.2358 - regression_loss: 0.2279 - classification_loss: 0.0079 32/500 [>.............................] - ETA: 3:54 - loss: 0.2333 - regression_loss: 0.2256 - classification_loss: 0.0077 33/500 [>.............................] - ETA: 3:54 - loss: 0.2324 - regression_loss: 0.2245 - classification_loss: 0.0078 34/500 [=>............................] - ETA: 3:53 - loss: 0.2296 - regression_loss: 0.2219 - classification_loss: 0.0077 35/500 [=>............................] - ETA: 3:53 - loss: 0.2283 - regression_loss: 0.2206 - classification_loss: 0.0078 36/500 [=>............................] - ETA: 3:52 - loss: 0.2317 - regression_loss: 0.2236 - classification_loss: 0.0081 37/500 [=>............................] - ETA: 3:51 - loss: 0.2336 - regression_loss: 0.2249 - classification_loss: 0.0086 38/500 [=>............................] - ETA: 3:51 - loss: 0.2300 - regression_loss: 0.2216 - classification_loss: 0.0084 39/500 [=>............................] - ETA: 3:51 - loss: 0.2308 - regression_loss: 0.2221 - classification_loss: 0.0088 40/500 [=>............................] - ETA: 3:50 - loss: 0.2282 - regression_loss: 0.2196 - classification_loss: 0.0087 41/500 [=>............................] - ETA: 3:50 - loss: 0.2284 - regression_loss: 0.2197 - classification_loss: 0.0086 42/500 [=>............................] - ETA: 3:49 - loss: 0.2265 - regression_loss: 0.2179 - classification_loss: 0.0085 43/500 [=>............................] - ETA: 3:49 - loss: 0.2290 - regression_loss: 0.2205 - classification_loss: 0.0085 44/500 [=>............................] - ETA: 3:49 - loss: 0.2315 - regression_loss: 0.2230 - classification_loss: 0.0085 45/500 [=>............................] - ETA: 3:48 - loss: 0.2283 - regression_loss: 0.2199 - classification_loss: 0.0084 46/500 [=>............................] 
- ETA: 3:48 - loss: 0.2264 - regression_loss: 0.2181 - classification_loss: 0.0083 47/500 [=>............................] - ETA: 3:47 - loss: 0.2249 - regression_loss: 0.2166 - classification_loss: 0.0083 48/500 [=>............................] - ETA: 3:47 - loss: 0.2260 - regression_loss: 0.2176 - classification_loss: 0.0084 49/500 [=>............................] - ETA: 3:46 - loss: 0.2278 - regression_loss: 0.2193 - classification_loss: 0.0086 50/500 [==>...........................] - ETA: 3:46 - loss: 0.2291 - regression_loss: 0.2202 - classification_loss: 0.0089 51/500 [==>...........................] - ETA: 3:46 - loss: 0.2292 - regression_loss: 0.2203 - classification_loss: 0.0089 52/500 [==>...........................] - ETA: 3:45 - loss: 0.2306 - regression_loss: 0.2217 - classification_loss: 0.0089 53/500 [==>...........................] - ETA: 3:44 - loss: 0.2305 - regression_loss: 0.2216 - classification_loss: 0.0089 54/500 [==>...........................] - ETA: 3:44 - loss: 0.2307 - regression_loss: 0.2214 - classification_loss: 0.0093 55/500 [==>...........................] - ETA: 3:43 - loss: 0.2333 - regression_loss: 0.2241 - classification_loss: 0.0092 56/500 [==>...........................] - ETA: 3:43 - loss: 0.2333 - regression_loss: 0.2242 - classification_loss: 0.0091 57/500 [==>...........................] - ETA: 3:42 - loss: 0.2306 - regression_loss: 0.2217 - classification_loss: 0.0090 58/500 [==>...........................] - ETA: 3:42 - loss: 0.2299 - regression_loss: 0.2210 - classification_loss: 0.0090 59/500 [==>...........................] - ETA: 3:41 - loss: 0.2283 - regression_loss: 0.2194 - classification_loss: 0.0089 60/500 [==>...........................] - ETA: 3:41 - loss: 0.2264 - regression_loss: 0.2175 - classification_loss: 0.0088 61/500 [==>...........................] - ETA: 3:40 - loss: 0.2264 - regression_loss: 0.2176 - classification_loss: 0.0088 62/500 [==>...........................] 
- ETA: 3:40 - loss: 0.2272 - regression_loss: 0.2183 - classification_loss: 0.0088 63/500 [==>...........................] - ETA: 3:39 - loss: 0.2245 - regression_loss: 0.2158 - classification_loss: 0.0087 64/500 [==>...........................] - ETA: 3:38 - loss: 0.2248 - regression_loss: 0.2160 - classification_loss: 0.0087 65/500 [==>...........................] - ETA: 3:38 - loss: 0.2224 - regression_loss: 0.2138 - classification_loss: 0.0086 66/500 [==>...........................] - ETA: 3:37 - loss: 0.2228 - regression_loss: 0.2142 - classification_loss: 0.0086 67/500 [===>..........................] - ETA: 3:36 - loss: 0.2211 - regression_loss: 0.2126 - classification_loss: 0.0085 68/500 [===>..........................] - ETA: 3:36 - loss: 0.2218 - regression_loss: 0.2132 - classification_loss: 0.0086 69/500 [===>..........................] - ETA: 3:35 - loss: 0.2209 - regression_loss: 0.2124 - classification_loss: 0.0086 70/500 [===>..........................] - ETA: 3:35 - loss: 0.2205 - regression_loss: 0.2119 - classification_loss: 0.0086 71/500 [===>..........................] - ETA: 3:34 - loss: 0.2187 - regression_loss: 0.2103 - classification_loss: 0.0085 72/500 [===>..........................] - ETA: 3:34 - loss: 0.2187 - regression_loss: 0.2103 - classification_loss: 0.0084 73/500 [===>..........................] - ETA: 3:34 - loss: 0.2169 - regression_loss: 0.2086 - classification_loss: 0.0083 74/500 [===>..........................] - ETA: 3:33 - loss: 0.2165 - regression_loss: 0.2082 - classification_loss: 0.0083 75/500 [===>..........................] - ETA: 3:33 - loss: 0.2148 - regression_loss: 0.2065 - classification_loss: 0.0083 76/500 [===>..........................] - ETA: 3:32 - loss: 0.2144 - regression_loss: 0.2062 - classification_loss: 0.0082 77/500 [===>..........................] - ETA: 3:32 - loss: 0.2208 - regression_loss: 0.2126 - classification_loss: 0.0082 78/500 [===>..........................] 
- ETA: 3:31 - loss: 0.2217 - regression_loss: 0.2136 - classification_loss: 0.0081 79/500 [===>..........................] - ETA: 3:31 - loss: 0.2225 - regression_loss: 0.2143 - classification_loss: 0.0082 80/500 [===>..........................] - ETA: 3:30 - loss: 0.2214 - regression_loss: 0.2132 - classification_loss: 0.0081 81/500 [===>..........................] - ETA: 3:30 - loss: 0.2195 - regression_loss: 0.2114 - classification_loss: 0.0081 82/500 [===>..........................] - ETA: 3:29 - loss: 0.2201 - regression_loss: 0.2121 - classification_loss: 0.0080 83/500 [===>..........................] - ETA: 3:29 - loss: 0.2195 - regression_loss: 0.2116 - classification_loss: 0.0080 84/500 [====>.........................] - ETA: 3:28 - loss: 0.2191 - regression_loss: 0.2111 - classification_loss: 0.0080 85/500 [====>.........................] - ETA: 3:28 - loss: 0.2191 - regression_loss: 0.2110 - classification_loss: 0.0081 86/500 [====>.........................] - ETA: 3:27 - loss: 0.2179 - regression_loss: 0.2098 - classification_loss: 0.0080 87/500 [====>.........................] - ETA: 3:27 - loss: 0.2161 - regression_loss: 0.2082 - classification_loss: 0.0080 88/500 [====>.........................] - ETA: 3:26 - loss: 0.2170 - regression_loss: 0.2091 - classification_loss: 0.0079 89/500 [====>.........................] - ETA: 3:26 - loss: 0.2165 - regression_loss: 0.2086 - classification_loss: 0.0079 90/500 [====>.........................] - ETA: 3:25 - loss: 0.2182 - regression_loss: 0.2103 - classification_loss: 0.0079 91/500 [====>.........................] - ETA: 3:25 - loss: 0.2220 - regression_loss: 0.2132 - classification_loss: 0.0088 92/500 [====>.........................] - ETA: 3:24 - loss: 0.2228 - regression_loss: 0.2140 - classification_loss: 0.0088 93/500 [====>.........................] - ETA: 3:24 - loss: 0.2213 - regression_loss: 0.2125 - classification_loss: 0.0087 94/500 [====>.........................] 
- ETA: 3:23 - loss: 0.2204 - regression_loss: 0.2116 - classification_loss: 0.0088 95/500 [====>.........................] - ETA: 3:23 - loss: 0.2196 - regression_loss: 0.2110 - classification_loss: 0.0087 96/500 [====>.........................] - ETA: 3:22 - loss: 0.2189 - regression_loss: 0.2102 - classification_loss: 0.0087 97/500 [====>.........................] - ETA: 3:22 - loss: 0.2195 - regression_loss: 0.2108 - classification_loss: 0.0087 98/500 [====>.........................] - ETA: 3:21 - loss: 0.2181 - regression_loss: 0.2095 - classification_loss: 0.0086 99/500 [====>.........................] - ETA: 3:21 - loss: 0.2165 - regression_loss: 0.2079 - classification_loss: 0.0086 100/500 [=====>........................] - ETA: 3:20 - loss: 0.2167 - regression_loss: 0.2082 - classification_loss: 0.0085 101/500 [=====>........................] - ETA: 3:20 - loss: 0.2179 - regression_loss: 0.2094 - classification_loss: 0.0085 102/500 [=====>........................] - ETA: 3:19 - loss: 0.2182 - regression_loss: 0.2096 - classification_loss: 0.0086 103/500 [=====>........................] - ETA: 3:19 - loss: 0.2172 - regression_loss: 0.2087 - classification_loss: 0.0085 104/500 [=====>........................] - ETA: 3:18 - loss: 0.2163 - regression_loss: 0.2079 - classification_loss: 0.0084 105/500 [=====>........................] - ETA: 3:18 - loss: 0.2154 - regression_loss: 0.2070 - classification_loss: 0.0084 106/500 [=====>........................] - ETA: 3:17 - loss: 0.2142 - regression_loss: 0.2059 - classification_loss: 0.0083 107/500 [=====>........................] - ETA: 3:17 - loss: 0.2140 - regression_loss: 0.2057 - classification_loss: 0.0083 108/500 [=====>........................] - ETA: 3:16 - loss: 0.2152 - regression_loss: 0.2068 - classification_loss: 0.0084 109/500 [=====>........................] - ETA: 3:16 - loss: 0.2163 - regression_loss: 0.2078 - classification_loss: 0.0084 110/500 [=====>........................] 
- ETA: 3:15 - loss: 0.2154 - regression_loss: 0.2070 - classification_loss: 0.0084 111/500 [=====>........................] - ETA: 3:15 - loss: 0.2153 - regression_loss: 0.2070 - classification_loss: 0.0083 112/500 [=====>........................] - ETA: 3:14 - loss: 0.2151 - regression_loss: 0.2068 - classification_loss: 0.0083 113/500 [=====>........................] - ETA: 3:14 - loss: 0.2140 - regression_loss: 0.2057 - classification_loss: 0.0082 114/500 [=====>........................] - ETA: 3:13 - loss: 0.2143 - regression_loss: 0.2061 - classification_loss: 0.0082 115/500 [=====>........................] - ETA: 3:13 - loss: 0.2147 - regression_loss: 0.2065 - classification_loss: 0.0082 116/500 [=====>........................] - ETA: 3:12 - loss: 0.2147 - regression_loss: 0.2065 - classification_loss: 0.0081 117/500 [======>.......................] - ETA: 3:12 - loss: 0.2148 - regression_loss: 0.2067 - classification_loss: 0.0081 118/500 [======>.......................] - ETA: 3:11 - loss: 0.2160 - regression_loss: 0.2077 - classification_loss: 0.0082 119/500 [======>.......................] - ETA: 3:11 - loss: 0.2163 - regression_loss: 0.2081 - classification_loss: 0.0083 120/500 [======>.......................] - ETA: 3:10 - loss: 0.2153 - regression_loss: 0.2071 - classification_loss: 0.0082 121/500 [======>.......................] - ETA: 3:10 - loss: 0.2143 - regression_loss: 0.2061 - classification_loss: 0.0082 122/500 [======>.......................] - ETA: 3:09 - loss: 0.2130 - regression_loss: 0.2049 - classification_loss: 0.0081 123/500 [======>.......................] - ETA: 3:09 - loss: 0.2118 - regression_loss: 0.2037 - classification_loss: 0.0081 124/500 [======>.......................] - ETA: 3:08 - loss: 0.2115 - regression_loss: 0.2034 - classification_loss: 0.0080 125/500 [======>.......................] - ETA: 3:08 - loss: 0.2112 - regression_loss: 0.2032 - classification_loss: 0.0080 126/500 [======>.......................] 
- ETA: 3:07 - loss: 0.2113 - regression_loss: 0.2034 - classification_loss: 0.0080 127/500 [======>.......................] - ETA: 3:07 - loss: 0.2119 - regression_loss: 0.2039 - classification_loss: 0.0080 128/500 [======>.......................] - ETA: 3:06 - loss: 0.2123 - regression_loss: 0.2043 - classification_loss: 0.0080 129/500 [======>.......................] - ETA: 3:06 - loss: 0.2120 - regression_loss: 0.2040 - classification_loss: 0.0080 130/500 [======>.......................] - ETA: 3:05 - loss: 0.2117 - regression_loss: 0.2037 - classification_loss: 0.0080 131/500 [======>.......................] - ETA: 3:05 - loss: 0.2116 - regression_loss: 0.2036 - classification_loss: 0.0080 132/500 [======>.......................] - ETA: 3:04 - loss: 0.2108 - regression_loss: 0.2029 - classification_loss: 0.0079 133/500 [======>.......................] - ETA: 3:04 - loss: 0.2112 - regression_loss: 0.2032 - classification_loss: 0.0080 134/500 [=======>......................] - ETA: 3:03 - loss: 0.2108 - regression_loss: 0.2028 - classification_loss: 0.0080 135/500 [=======>......................] - ETA: 3:03 - loss: 0.2103 - regression_loss: 0.2024 - classification_loss: 0.0080 136/500 [=======>......................] - ETA: 3:02 - loss: 0.2096 - regression_loss: 0.2017 - classification_loss: 0.0079 137/500 [=======>......................] - ETA: 3:02 - loss: 0.2087 - regression_loss: 0.2009 - classification_loss: 0.0079 138/500 [=======>......................] - ETA: 3:01 - loss: 0.2082 - regression_loss: 0.2004 - classification_loss: 0.0078 139/500 [=======>......................] - ETA: 3:01 - loss: 0.2091 - regression_loss: 0.2012 - classification_loss: 0.0079 140/500 [=======>......................] - ETA: 3:00 - loss: 0.2087 - regression_loss: 0.2008 - classification_loss: 0.0078 141/500 [=======>......................] - ETA: 3:00 - loss: 0.2096 - regression_loss: 0.2017 - classification_loss: 0.0079 142/500 [=======>......................] 
- ETA: 2:59 - loss: 0.2094 - regression_loss: 0.2015 - classification_loss: 0.0079 143/500 [=======>......................] - ETA: 2:59 - loss: 0.2097 - regression_loss: 0.2018 - classification_loss: 0.0079 144/500 [=======>......................] - ETA: 2:58 - loss: 0.2107 - regression_loss: 0.2029 - classification_loss: 0.0078 145/500 [=======>......................] - ETA: 2:58 - loss: 0.2105 - regression_loss: 0.2026 - classification_loss: 0.0078 146/500 [=======>......................] - ETA: 2:57 - loss: 0.2104 - regression_loss: 0.2025 - classification_loss: 0.0079 147/500 [=======>......................] - ETA: 2:57 - loss: 0.2100 - regression_loss: 0.2022 - classification_loss: 0.0078 148/500 [=======>......................] - ETA: 2:56 - loss: 0.2098 - regression_loss: 0.2020 - classification_loss: 0.0078 149/500 [=======>......................] - ETA: 2:56 - loss: 0.2093 - regression_loss: 0.2015 - classification_loss: 0.0078 150/500 [========>.....................] - ETA: 2:55 - loss: 0.2083 - regression_loss: 0.2005 - classification_loss: 0.0077 151/500 [========>.....................] - ETA: 2:55 - loss: 0.2079 - regression_loss: 0.2002 - classification_loss: 0.0077 152/500 [========>.....................] - ETA: 2:54 - loss: 0.2084 - regression_loss: 0.2007 - classification_loss: 0.0077 153/500 [========>.....................] - ETA: 2:54 - loss: 0.2087 - regression_loss: 0.2010 - classification_loss: 0.0077 154/500 [========>.....................] - ETA: 2:53 - loss: 0.2091 - regression_loss: 0.2014 - classification_loss: 0.0077 155/500 [========>.....................] - ETA: 2:53 - loss: 0.2093 - regression_loss: 0.2016 - classification_loss: 0.0077 156/500 [========>.....................] - ETA: 2:52 - loss: 0.2084 - regression_loss: 0.2008 - classification_loss: 0.0076 157/500 [========>.....................] - ETA: 2:52 - loss: 0.2081 - regression_loss: 0.2005 - classification_loss: 0.0076 158/500 [========>.....................] 
- ETA: 2:51 - loss: 0.2075 - regression_loss: 0.1999 - classification_loss: 0.0076 159/500 [========>.....................] - ETA: 2:51 - loss: 0.2077 - regression_loss: 0.2001 - classification_loss: 0.0076 160/500 [========>.....................] - ETA: 2:50 - loss: 0.2085 - regression_loss: 0.2007 - classification_loss: 0.0078 161/500 [========>.....................] - ETA: 2:50 - loss: 0.2078 - regression_loss: 0.2001 - classification_loss: 0.0078 162/500 [========>.....................] - ETA: 2:49 - loss: 0.2081 - regression_loss: 0.2004 - classification_loss: 0.0077 163/500 [========>.....................] - ETA: 2:49 - loss: 0.2077 - regression_loss: 0.2000 - classification_loss: 0.0077 164/500 [========>.....................] - ETA: 2:48 - loss: 0.2073 - regression_loss: 0.1997 - classification_loss: 0.0077 165/500 [========>.....................] - ETA: 2:48 - loss: 0.2072 - regression_loss: 0.1995 - classification_loss: 0.0077 166/500 [========>.....................] - ETA: 2:47 - loss: 0.2070 - regression_loss: 0.1994 - classification_loss: 0.0076 167/500 [=========>....................] - ETA: 2:47 - loss: 0.2066 - regression_loss: 0.1990 - classification_loss: 0.0077 168/500 [=========>....................] - ETA: 2:46 - loss: 0.2064 - regression_loss: 0.1988 - classification_loss: 0.0077 169/500 [=========>....................] - ETA: 2:46 - loss: 0.2066 - regression_loss: 0.1989 - classification_loss: 0.0077 170/500 [=========>....................] - ETA: 2:45 - loss: 0.2062 - regression_loss: 0.1985 - classification_loss: 0.0077 171/500 [=========>....................] - ETA: 2:45 - loss: 0.2054 - regression_loss: 0.1978 - classification_loss: 0.0076 172/500 [=========>....................] - ETA: 2:44 - loss: 0.2060 - regression_loss: 0.1983 - classification_loss: 0.0077 173/500 [=========>....................] - ETA: 2:44 - loss: 0.2052 - regression_loss: 0.1975 - classification_loss: 0.0077 174/500 [=========>....................] 
- ETA: 2:43 - loss: 0.2053 - regression_loss: 0.1976 - classification_loss: 0.0077 175/500 [=========>....................] - ETA: 2:43 - loss: 0.2053 - regression_loss: 0.1977 - classification_loss: 0.0076 176/500 [=========>....................] - ETA: 2:42 - loss: 0.2053 - regression_loss: 0.1976 - classification_loss: 0.0076 177/500 [=========>....................] - ETA: 2:42 - loss: 0.2058 - regression_loss: 0.1982 - classification_loss: 0.0076 178/500 [=========>....................] - ETA: 2:41 - loss: 0.2062 - regression_loss: 0.1986 - classification_loss: 0.0076 179/500 [=========>....................] - ETA: 2:41 - loss: 0.2066 - regression_loss: 0.1989 - classification_loss: 0.0077 180/500 [=========>....................] - ETA: 2:40 - loss: 0.2064 - regression_loss: 0.1987 - classification_loss: 0.0077 181/500 [=========>....................] - ETA: 2:40 - loss: 0.2058 - regression_loss: 0.1981 - classification_loss: 0.0076 182/500 [=========>....................] - ETA: 2:39 - loss: 0.2059 - regression_loss: 0.1983 - classification_loss: 0.0077 183/500 [=========>....................] - ETA: 2:39 - loss: 0.2057 - regression_loss: 0.1980 - classification_loss: 0.0077 184/500 [==========>...................] - ETA: 2:38 - loss: 0.2060 - regression_loss: 0.1982 - classification_loss: 0.0077 185/500 [==========>...................] - ETA: 2:38 - loss: 0.2059 - regression_loss: 0.1983 - classification_loss: 0.0077 186/500 [==========>...................] - ETA: 2:37 - loss: 0.2054 - regression_loss: 0.1978 - classification_loss: 0.0077 187/500 [==========>...................] - ETA: 2:37 - loss: 0.2052 - regression_loss: 0.1975 - classification_loss: 0.0076 188/500 [==========>...................] - ETA: 2:36 - loss: 0.2050 - regression_loss: 0.1974 - classification_loss: 0.0076 189/500 [==========>...................] - ETA: 2:36 - loss: 0.2054 - regression_loss: 0.1978 - classification_loss: 0.0076 190/500 [==========>...................] 
- ETA: 2:35 - loss: 0.2056 - regression_loss: 0.1979 - classification_loss: 0.0077 191/500 [==========>...................] - ETA: 2:35 - loss: 0.2049 - regression_loss: 0.1973 - classification_loss: 0.0076 192/500 [==========>...................] - ETA: 2:34 - loss: 0.2045 - regression_loss: 0.1969 - classification_loss: 0.0076 193/500 [==========>...................] - ETA: 2:34 - loss: 0.2050 - regression_loss: 0.1973 - classification_loss: 0.0076 194/500 [==========>...................] - ETA: 2:33 - loss: 0.2054 - regression_loss: 0.1978 - classification_loss: 0.0076 195/500 [==========>...................] - ETA: 2:33 - loss: 0.2049 - regression_loss: 0.1973 - classification_loss: 0.0076 196/500 [==========>...................] - ETA: 2:32 - loss: 0.2049 - regression_loss: 0.1973 - classification_loss: 0.0076 197/500 [==========>...................] - ETA: 2:32 - loss: 0.2043 - regression_loss: 0.1967 - classification_loss: 0.0075 198/500 [==========>...................] - ETA: 2:31 - loss: 0.2042 - regression_loss: 0.1967 - classification_loss: 0.0075 199/500 [==========>...................] - ETA: 2:31 - loss: 0.2041 - regression_loss: 0.1966 - classification_loss: 0.0075 200/500 [===========>..................] - ETA: 2:30 - loss: 0.2042 - regression_loss: 0.1967 - classification_loss: 0.0075 201/500 [===========>..................] - ETA: 2:30 - loss: 0.2038 - regression_loss: 0.1963 - classification_loss: 0.0075 202/500 [===========>..................] - ETA: 2:29 - loss: 0.2035 - regression_loss: 0.1960 - classification_loss: 0.0075 203/500 [===========>..................] - ETA: 2:29 - loss: 0.2032 - regression_loss: 0.1957 - classification_loss: 0.0075 204/500 [===========>..................] - ETA: 2:28 - loss: 0.2035 - regression_loss: 0.1960 - classification_loss: 0.0075 205/500 [===========>..................] - ETA: 2:28 - loss: 0.2043 - regression_loss: 0.1967 - classification_loss: 0.0075 206/500 [===========>..................] 
- ETA: 2:27 - loss: 0.2036 - regression_loss: 0.1961 - classification_loss: 0.0075 207/500 [===========>..................] - ETA: 2:27 - loss: 0.2039 - regression_loss: 0.1964 - classification_loss: 0.0076 208/500 [===========>..................] - ETA: 2:26 - loss: 0.2041 - regression_loss: 0.1965 - classification_loss: 0.0076 209/500 [===========>..................] - ETA: 2:26 - loss: 0.2047 - regression_loss: 0.1971 - classification_loss: 0.0076 210/500 [===========>..................] - ETA: 2:25 - loss: 0.2043 - regression_loss: 0.1967 - classification_loss: 0.0076 211/500 [===========>..................] - ETA: 2:25 - loss: 0.2041 - regression_loss: 0.1966 - classification_loss: 0.0076 212/500 [===========>..................] - ETA: 2:24 - loss: 0.2044 - regression_loss: 0.1968 - classification_loss: 0.0076 213/500 [===========>..................] - ETA: 2:24 - loss: 0.2049 - regression_loss: 0.1973 - classification_loss: 0.0076 214/500 [===========>..................] - ETA: 2:23 - loss: 0.2048 - regression_loss: 0.1973 - classification_loss: 0.0076 215/500 [===========>..................] - ETA: 2:23 - loss: 0.2043 - regression_loss: 0.1967 - classification_loss: 0.0076 216/500 [===========>..................] - ETA: 2:22 - loss: 0.2041 - regression_loss: 0.1965 - classification_loss: 0.0076 217/500 [============>.................] - ETA: 2:22 - loss: 0.2042 - regression_loss: 0.1966 - classification_loss: 0.0076 218/500 [============>.................] - ETA: 2:21 - loss: 0.2044 - regression_loss: 0.1969 - classification_loss: 0.0076 219/500 [============>.................] - ETA: 2:21 - loss: 0.2042 - regression_loss: 0.1967 - classification_loss: 0.0075 220/500 [============>.................] - ETA: 2:20 - loss: 0.2037 - regression_loss: 0.1962 - classification_loss: 0.0075 221/500 [============>.................] - ETA: 2:20 - loss: 0.2033 - regression_loss: 0.1958 - classification_loss: 0.0075 222/500 [============>.................] 
- ETA: 2:19 - loss: 0.2029 - regression_loss: 0.1955 - classification_loss: 0.0075 223/500 [============>.................] - ETA: 2:19 - loss: 0.2030 - regression_loss: 0.1955 - classification_loss: 0.0074 224/500 [============>.................] - ETA: 2:18 - loss: 0.2043 - regression_loss: 0.1968 - classification_loss: 0.0075 225/500 [============>.................] - ETA: 2:18 - loss: 0.2048 - regression_loss: 0.1973 - classification_loss: 0.0075 226/500 [============>.................] - ETA: 2:17 - loss: 0.2046 - regression_loss: 0.1971 - classification_loss: 0.0075 227/500 [============>.................] - ETA: 2:17 - loss: 0.2044 - regression_loss: 0.1969 - classification_loss: 0.0075 228/500 [============>.................] - ETA: 2:16 - loss: 0.2042 - regression_loss: 0.1967 - classification_loss: 0.0075 229/500 [============>.................] - ETA: 2:16 - loss: 0.2046 - regression_loss: 0.1971 - classification_loss: 0.0075 230/500 [============>.................] - ETA: 2:15 - loss: 0.2050 - regression_loss: 0.1975 - classification_loss: 0.0076 231/500 [============>.................] - ETA: 2:15 - loss: 0.2050 - regression_loss: 0.1975 - classification_loss: 0.0075 232/500 [============>.................] - ETA: 2:14 - loss: 0.2050 - regression_loss: 0.1975 - classification_loss: 0.0075 233/500 [============>.................] - ETA: 2:14 - loss: 0.2064 - regression_loss: 0.1989 - classification_loss: 0.0075 234/500 [=============>................] - ETA: 2:13 - loss: 0.2071 - regression_loss: 0.1995 - classification_loss: 0.0076 235/500 [=============>................] - ETA: 2:13 - loss: 0.2071 - regression_loss: 0.1995 - classification_loss: 0.0076 236/500 [=============>................] - ETA: 2:12 - loss: 0.2069 - regression_loss: 0.1993 - classification_loss: 0.0075 237/500 [=============>................] - ETA: 2:12 - loss: 0.2067 - regression_loss: 0.1991 - classification_loss: 0.0075 238/500 [=============>................] 
- ETA: 2:11 - loss: 0.2060 - regression_loss: 0.1985 - classification_loss: 0.0075 239/500 [=============>................] - ETA: 2:11 - loss: 0.2058 - regression_loss: 0.1983 - classification_loss: 0.0075 240/500 [=============>................] - ETA: 2:10 - loss: 0.2056 - regression_loss: 0.1981 - classification_loss: 0.0075 241/500 [=============>................] - ETA: 2:10 - loss: 0.2056 - regression_loss: 0.1982 - classification_loss: 0.0075 242/500 [=============>................] - ETA: 2:09 - loss: 0.2054 - regression_loss: 0.1979 - classification_loss: 0.0074 243/500 [=============>................] - ETA: 2:09 - loss: 0.2056 - regression_loss: 0.1981 - classification_loss: 0.0074 244/500 [=============>................] - ETA: 2:08 - loss: 0.2058 - regression_loss: 0.1983 - classification_loss: 0.0075 245/500 [=============>................] - ETA: 2:08 - loss: 0.2061 - regression_loss: 0.1986 - classification_loss: 0.0075 246/500 [=============>................] - ETA: 2:07 - loss: 0.2057 - regression_loss: 0.1982 - classification_loss: 0.0075 247/500 [=============>................] - ETA: 2:07 - loss: 0.2062 - regression_loss: 0.1987 - classification_loss: 0.0075 248/500 [=============>................] - ETA: 2:06 - loss: 0.2066 - regression_loss: 0.1991 - classification_loss: 0.0075 249/500 [=============>................] - ETA: 2:06 - loss: 0.2063 - regression_loss: 0.1988 - classification_loss: 0.0075 250/500 [==============>...............] - ETA: 2:05 - loss: 0.2058 - regression_loss: 0.1983 - classification_loss: 0.0075 251/500 [==============>...............] - ETA: 2:05 - loss: 0.2063 - regression_loss: 0.1988 - classification_loss: 0.0075 252/500 [==============>...............] - ETA: 2:04 - loss: 0.2068 - regression_loss: 0.1993 - classification_loss: 0.0075 253/500 [==============>...............] - ETA: 2:04 - loss: 0.2071 - regression_loss: 0.1996 - classification_loss: 0.0075 254/500 [==============>...............] 
[per-batch progress output for epoch 40 (batches 255-499) omitted; loss hovered around 0.20-0.21 with classification_loss ~0.0075 throughout]
500/500 [==============================] - 253s 505ms/step - loss: 0.2010 - regression_loss: 0.1935 - classification_loss: 0.0075
1172 instances of class plum with average precision: 0.7519
mAP: 0.7519
Epoch 00040: saving model to ./training/snapshots/resnet101_pascal_40.h5
Epoch 41/150
[per-batch progress output for epoch 41 (batches 10-89) omitted; loss climbed from ~0.15 to settle around 0.22 with classification_loss ~0.010]
- ETA: 3:28 - loss: 0.2211 - regression_loss: 0.2113 - classification_loss: 0.0098 90/500 [====>.........................] - ETA: 3:28 - loss: 0.2213 - regression_loss: 0.2116 - classification_loss: 0.0098 91/500 [====>.........................] - ETA: 3:27 - loss: 0.2203 - regression_loss: 0.2106 - classification_loss: 0.0097 92/500 [====>.........................] - ETA: 3:27 - loss: 0.2186 - regression_loss: 0.2090 - classification_loss: 0.0096 93/500 [====>.........................] - ETA: 3:26 - loss: 0.2195 - regression_loss: 0.2099 - classification_loss: 0.0096 94/500 [====>.........................] - ETA: 3:26 - loss: 0.2210 - regression_loss: 0.2114 - classification_loss: 0.0097 95/500 [====>.........................] - ETA: 3:25 - loss: 0.2217 - regression_loss: 0.2121 - classification_loss: 0.0096 96/500 [====>.........................] - ETA: 3:25 - loss: 0.2216 - regression_loss: 0.2120 - classification_loss: 0.0096 97/500 [====>.........................] - ETA: 3:24 - loss: 0.2221 - regression_loss: 0.2124 - classification_loss: 0.0097 98/500 [====>.........................] - ETA: 3:24 - loss: 0.2220 - regression_loss: 0.2123 - classification_loss: 0.0097 99/500 [====>.........................] - ETA: 3:23 - loss: 0.2207 - regression_loss: 0.2111 - classification_loss: 0.0096 100/500 [=====>........................] - ETA: 3:23 - loss: 0.2203 - regression_loss: 0.2107 - classification_loss: 0.0096 101/500 [=====>........................] - ETA: 3:22 - loss: 0.2203 - regression_loss: 0.2106 - classification_loss: 0.0096 102/500 [=====>........................] - ETA: 3:22 - loss: 0.2190 - regression_loss: 0.2095 - classification_loss: 0.0096 103/500 [=====>........................] - ETA: 3:21 - loss: 0.2179 - regression_loss: 0.2085 - classification_loss: 0.0095 104/500 [=====>........................] - ETA: 3:21 - loss: 0.2167 - regression_loss: 0.2073 - classification_loss: 0.0094 105/500 [=====>........................] 
- ETA: 3:20 - loss: 0.2168 - regression_loss: 0.2074 - classification_loss: 0.0094 106/500 [=====>........................] - ETA: 3:20 - loss: 0.2162 - regression_loss: 0.2069 - classification_loss: 0.0093 107/500 [=====>........................] - ETA: 3:19 - loss: 0.2155 - regression_loss: 0.2062 - classification_loss: 0.0093 108/500 [=====>........................] - ETA: 3:19 - loss: 0.2150 - regression_loss: 0.2058 - classification_loss: 0.0092 109/500 [=====>........................] - ETA: 3:18 - loss: 0.2158 - regression_loss: 0.2066 - classification_loss: 0.0092 110/500 [=====>........................] - ETA: 3:18 - loss: 0.2158 - regression_loss: 0.2067 - classification_loss: 0.0091 111/500 [=====>........................] - ETA: 3:17 - loss: 0.2157 - regression_loss: 0.2066 - classification_loss: 0.0091 112/500 [=====>........................] - ETA: 3:17 - loss: 0.2154 - regression_loss: 0.2063 - classification_loss: 0.0091 113/500 [=====>........................] - ETA: 3:16 - loss: 0.2163 - regression_loss: 0.2072 - classification_loss: 0.0091 114/500 [=====>........................] - ETA: 3:16 - loss: 0.2161 - regression_loss: 0.2070 - classification_loss: 0.0091 115/500 [=====>........................] - ETA: 3:15 - loss: 0.2157 - regression_loss: 0.2066 - classification_loss: 0.0091 116/500 [=====>........................] - ETA: 3:15 - loss: 0.2149 - regression_loss: 0.2059 - classification_loss: 0.0090 117/500 [======>.......................] - ETA: 3:14 - loss: 0.2140 - regression_loss: 0.2050 - classification_loss: 0.0090 118/500 [======>.......................] - ETA: 3:14 - loss: 0.2149 - regression_loss: 0.2058 - classification_loss: 0.0091 119/500 [======>.......................] - ETA: 3:13 - loss: 0.2142 - regression_loss: 0.2051 - classification_loss: 0.0090 120/500 [======>.......................] - ETA: 3:13 - loss: 0.2147 - regression_loss: 0.2056 - classification_loss: 0.0091 121/500 [======>.......................] 
- ETA: 3:12 - loss: 0.2152 - regression_loss: 0.2061 - classification_loss: 0.0090 122/500 [======>.......................] - ETA: 3:12 - loss: 0.2147 - regression_loss: 0.2057 - classification_loss: 0.0090 123/500 [======>.......................] - ETA: 3:11 - loss: 0.2136 - regression_loss: 0.2047 - classification_loss: 0.0089 124/500 [======>.......................] - ETA: 3:11 - loss: 0.2136 - regression_loss: 0.2047 - classification_loss: 0.0090 125/500 [======>.......................] - ETA: 3:10 - loss: 0.2132 - regression_loss: 0.2043 - classification_loss: 0.0089 126/500 [======>.......................] - ETA: 3:10 - loss: 0.2123 - regression_loss: 0.2035 - classification_loss: 0.0089 127/500 [======>.......................] - ETA: 3:09 - loss: 0.2130 - regression_loss: 0.2041 - classification_loss: 0.0088 128/500 [======>.......................] - ETA: 3:09 - loss: 0.2123 - regression_loss: 0.2035 - classification_loss: 0.0088 129/500 [======>.......................] - ETA: 3:08 - loss: 0.2117 - regression_loss: 0.2030 - classification_loss: 0.0087 130/500 [======>.......................] - ETA: 3:08 - loss: 0.2115 - regression_loss: 0.2027 - classification_loss: 0.0087 131/500 [======>.......................] - ETA: 3:07 - loss: 0.2104 - regression_loss: 0.2017 - classification_loss: 0.0087 132/500 [======>.......................] - ETA: 3:07 - loss: 0.2102 - regression_loss: 0.2015 - classification_loss: 0.0087 133/500 [======>.......................] - ETA: 3:06 - loss: 0.2091 - regression_loss: 0.2004 - classification_loss: 0.0086 134/500 [=======>......................] - ETA: 3:06 - loss: 0.2086 - regression_loss: 0.1999 - classification_loss: 0.0086 135/500 [=======>......................] - ETA: 3:05 - loss: 0.2082 - regression_loss: 0.1996 - classification_loss: 0.0086 136/500 [=======>......................] - ETA: 3:05 - loss: 0.2085 - regression_loss: 0.1999 - classification_loss: 0.0086 137/500 [=======>......................] 
- ETA: 3:04 - loss: 0.2081 - regression_loss: 0.1995 - classification_loss: 0.0086 138/500 [=======>......................] - ETA: 3:04 - loss: 0.2074 - regression_loss: 0.1988 - classification_loss: 0.0086 139/500 [=======>......................] - ETA: 3:03 - loss: 0.2069 - regression_loss: 0.1984 - classification_loss: 0.0085 140/500 [=======>......................] - ETA: 3:03 - loss: 0.2072 - regression_loss: 0.1987 - classification_loss: 0.0085 141/500 [=======>......................] - ETA: 3:02 - loss: 0.2068 - regression_loss: 0.1983 - classification_loss: 0.0085 142/500 [=======>......................] - ETA: 3:02 - loss: 0.2064 - regression_loss: 0.1980 - classification_loss: 0.0084 143/500 [=======>......................] - ETA: 3:01 - loss: 0.2068 - regression_loss: 0.1984 - classification_loss: 0.0085 144/500 [=======>......................] - ETA: 3:01 - loss: 0.2071 - regression_loss: 0.1986 - classification_loss: 0.0084 145/500 [=======>......................] - ETA: 3:00 - loss: 0.2068 - regression_loss: 0.1984 - classification_loss: 0.0084 146/500 [=======>......................] - ETA: 3:00 - loss: 0.2077 - regression_loss: 0.1992 - classification_loss: 0.0085 147/500 [=======>......................] - ETA: 2:59 - loss: 0.2078 - regression_loss: 0.1993 - classification_loss: 0.0085 148/500 [=======>......................] - ETA: 2:59 - loss: 0.2068 - regression_loss: 0.1983 - classification_loss: 0.0084 149/500 [=======>......................] - ETA: 2:58 - loss: 0.2062 - regression_loss: 0.1977 - classification_loss: 0.0084 150/500 [========>.....................] - ETA: 2:58 - loss: 0.2061 - regression_loss: 0.1978 - classification_loss: 0.0084 151/500 [========>.....................] - ETA: 2:57 - loss: 0.2067 - regression_loss: 0.1983 - classification_loss: 0.0084 152/500 [========>.....................] - ETA: 2:57 - loss: 0.2063 - regression_loss: 0.1979 - classification_loss: 0.0084 153/500 [========>.....................] 
- ETA: 2:56 - loss: 0.2069 - regression_loss: 0.1984 - classification_loss: 0.0085 154/500 [========>.....................] - ETA: 2:56 - loss: 0.2063 - regression_loss: 0.1978 - classification_loss: 0.0085 155/500 [========>.....................] - ETA: 2:55 - loss: 0.2065 - regression_loss: 0.1980 - classification_loss: 0.0085 156/500 [========>.....................] - ETA: 2:55 - loss: 0.2057 - regression_loss: 0.1973 - classification_loss: 0.0085 157/500 [========>.....................] - ETA: 2:54 - loss: 0.2057 - regression_loss: 0.1973 - classification_loss: 0.0084 158/500 [========>.....................] - ETA: 2:54 - loss: 0.2056 - regression_loss: 0.1971 - classification_loss: 0.0084 159/500 [========>.....................] - ETA: 2:53 - loss: 0.2059 - regression_loss: 0.1975 - classification_loss: 0.0084 160/500 [========>.....................] - ETA: 2:53 - loss: 0.2057 - regression_loss: 0.1973 - classification_loss: 0.0084 161/500 [========>.....................] - ETA: 2:52 - loss: 0.2065 - regression_loss: 0.1981 - classification_loss: 0.0084 162/500 [========>.....................] - ETA: 2:52 - loss: 0.2067 - regression_loss: 0.1983 - classification_loss: 0.0084 163/500 [========>.....................] - ETA: 2:51 - loss: 0.2061 - regression_loss: 0.1978 - classification_loss: 0.0083 164/500 [========>.....................] - ETA: 2:51 - loss: 0.2066 - regression_loss: 0.1983 - classification_loss: 0.0083 165/500 [========>.....................] - ETA: 2:50 - loss: 0.2063 - regression_loss: 0.1980 - classification_loss: 0.0083 166/500 [========>.....................] - ETA: 2:50 - loss: 0.2074 - regression_loss: 0.1991 - classification_loss: 0.0083 167/500 [=========>....................] - ETA: 2:49 - loss: 0.2103 - regression_loss: 0.2020 - classification_loss: 0.0083 168/500 [=========>....................] - ETA: 2:49 - loss: 0.2104 - regression_loss: 0.2020 - classification_loss: 0.0083 169/500 [=========>....................] 
- ETA: 2:48 - loss: 0.2107 - regression_loss: 0.2024 - classification_loss: 0.0083 170/500 [=========>....................] - ETA: 2:48 - loss: 0.2102 - regression_loss: 0.2018 - classification_loss: 0.0083 171/500 [=========>....................] - ETA: 2:47 - loss: 0.2098 - regression_loss: 0.2015 - classification_loss: 0.0083 172/500 [=========>....................] - ETA: 2:47 - loss: 0.2092 - regression_loss: 0.2010 - classification_loss: 0.0082 173/500 [=========>....................] - ETA: 2:46 - loss: 0.2090 - regression_loss: 0.2008 - classification_loss: 0.0082 174/500 [=========>....................] - ETA: 2:46 - loss: 0.2093 - regression_loss: 0.2010 - classification_loss: 0.0082 175/500 [=========>....................] - ETA: 2:45 - loss: 0.2088 - regression_loss: 0.2006 - classification_loss: 0.0082 176/500 [=========>....................] - ETA: 2:45 - loss: 0.2107 - regression_loss: 0.2024 - classification_loss: 0.0083 177/500 [=========>....................] - ETA: 2:44 - loss: 0.2099 - regression_loss: 0.2017 - classification_loss: 0.0082 178/500 [=========>....................] - ETA: 2:44 - loss: 0.2108 - regression_loss: 0.2026 - classification_loss: 0.0082 179/500 [=========>....................] - ETA: 2:43 - loss: 0.2105 - regression_loss: 0.2023 - classification_loss: 0.0082 180/500 [=========>....................] - ETA: 2:43 - loss: 0.2101 - regression_loss: 0.2020 - classification_loss: 0.0081 181/500 [=========>....................] - ETA: 2:42 - loss: 0.2099 - regression_loss: 0.2018 - classification_loss: 0.0081 182/500 [=========>....................] - ETA: 2:42 - loss: 0.2091 - regression_loss: 0.2010 - classification_loss: 0.0081 183/500 [=========>....................] - ETA: 2:41 - loss: 0.2087 - regression_loss: 0.2006 - classification_loss: 0.0080 184/500 [==========>...................] - ETA: 2:41 - loss: 0.2078 - regression_loss: 0.1998 - classification_loss: 0.0080 185/500 [==========>...................] 
- ETA: 2:40 - loss: 0.2075 - regression_loss: 0.1995 - classification_loss: 0.0080 186/500 [==========>...................] - ETA: 2:40 - loss: 0.2074 - regression_loss: 0.1994 - classification_loss: 0.0080 187/500 [==========>...................] - ETA: 2:39 - loss: 0.2074 - regression_loss: 0.1995 - classification_loss: 0.0080 188/500 [==========>...................] - ETA: 2:39 - loss: 0.2078 - regression_loss: 0.1997 - classification_loss: 0.0081 189/500 [==========>...................] - ETA: 2:38 - loss: 0.2075 - regression_loss: 0.1995 - classification_loss: 0.0081 190/500 [==========>...................] - ETA: 2:38 - loss: 0.2073 - regression_loss: 0.1993 - classification_loss: 0.0080 191/500 [==========>...................] - ETA: 2:37 - loss: 0.2073 - regression_loss: 0.1992 - classification_loss: 0.0081 192/500 [==========>...................] - ETA: 2:37 - loss: 0.2064 - regression_loss: 0.1984 - classification_loss: 0.0080 193/500 [==========>...................] - ETA: 2:36 - loss: 0.2060 - regression_loss: 0.1981 - classification_loss: 0.0080 194/500 [==========>...................] - ETA: 2:36 - loss: 0.2059 - regression_loss: 0.1979 - classification_loss: 0.0080 195/500 [==========>...................] - ETA: 2:35 - loss: 0.2062 - regression_loss: 0.1981 - classification_loss: 0.0081 196/500 [==========>...................] - ETA: 2:35 - loss: 0.2059 - regression_loss: 0.1978 - classification_loss: 0.0081 197/500 [==========>...................] - ETA: 2:34 - loss: 0.2054 - regression_loss: 0.1973 - classification_loss: 0.0081 198/500 [==========>...................] - ETA: 2:34 - loss: 0.2057 - regression_loss: 0.1976 - classification_loss: 0.0081 199/500 [==========>...................] - ETA: 2:33 - loss: 0.2056 - regression_loss: 0.1976 - classification_loss: 0.0080 200/500 [===========>..................] - ETA: 2:33 - loss: 0.2052 - regression_loss: 0.1972 - classification_loss: 0.0080 201/500 [===========>..................] 
- ETA: 2:32 - loss: 0.2049 - regression_loss: 0.1968 - classification_loss: 0.0080 202/500 [===========>..................] - ETA: 2:32 - loss: 0.2052 - regression_loss: 0.1971 - classification_loss: 0.0080 203/500 [===========>..................] - ETA: 2:31 - loss: 0.2049 - regression_loss: 0.1969 - classification_loss: 0.0080 204/500 [===========>..................] - ETA: 2:31 - loss: 0.2044 - regression_loss: 0.1964 - classification_loss: 0.0080 205/500 [===========>..................] - ETA: 2:30 - loss: 0.2050 - regression_loss: 0.1970 - classification_loss: 0.0080 206/500 [===========>..................] - ETA: 2:30 - loss: 0.2043 - regression_loss: 0.1963 - classification_loss: 0.0079 207/500 [===========>..................] - ETA: 2:29 - loss: 0.2044 - regression_loss: 0.1965 - classification_loss: 0.0079 208/500 [===========>..................] - ETA: 2:29 - loss: 0.2044 - regression_loss: 0.1965 - classification_loss: 0.0079 209/500 [===========>..................] - ETA: 2:28 - loss: 0.2040 - regression_loss: 0.1961 - classification_loss: 0.0079 210/500 [===========>..................] - ETA: 2:28 - loss: 0.2044 - regression_loss: 0.1965 - classification_loss: 0.0079 211/500 [===========>..................] - ETA: 2:27 - loss: 0.2040 - regression_loss: 0.1961 - classification_loss: 0.0079 212/500 [===========>..................] - ETA: 2:27 - loss: 0.2036 - regression_loss: 0.1958 - classification_loss: 0.0079 213/500 [===========>..................] - ETA: 2:26 - loss: 0.2038 - regression_loss: 0.1959 - classification_loss: 0.0079 214/500 [===========>..................] - ETA: 2:26 - loss: 0.2034 - regression_loss: 0.1955 - classification_loss: 0.0079 215/500 [===========>..................] - ETA: 2:25 - loss: 0.2038 - regression_loss: 0.1959 - classification_loss: 0.0079 216/500 [===========>..................] - ETA: 2:25 - loss: 0.2031 - regression_loss: 0.1953 - classification_loss: 0.0079 217/500 [============>.................] 
- ETA: 2:24 - loss: 0.2030 - regression_loss: 0.1951 - classification_loss: 0.0078 218/500 [============>.................] - ETA: 2:24 - loss: 0.2028 - regression_loss: 0.1949 - classification_loss: 0.0078 219/500 [============>.................] - ETA: 2:23 - loss: 0.2028 - regression_loss: 0.1950 - classification_loss: 0.0078 220/500 [============>.................] - ETA: 2:22 - loss: 0.2022 - regression_loss: 0.1944 - classification_loss: 0.0078 221/500 [============>.................] - ETA: 2:22 - loss: 0.2025 - regression_loss: 0.1947 - classification_loss: 0.0078 222/500 [============>.................] - ETA: 2:21 - loss: 0.2023 - regression_loss: 0.1945 - classification_loss: 0.0078 223/500 [============>.................] - ETA: 2:21 - loss: 0.2020 - regression_loss: 0.1942 - classification_loss: 0.0078 224/500 [============>.................] - ETA: 2:20 - loss: 0.2023 - regression_loss: 0.1945 - classification_loss: 0.0078 225/500 [============>.................] - ETA: 2:20 - loss: 0.2026 - regression_loss: 0.1949 - classification_loss: 0.0078 226/500 [============>.................] - ETA: 2:19 - loss: 0.2023 - regression_loss: 0.1945 - classification_loss: 0.0078 227/500 [============>.................] - ETA: 2:19 - loss: 0.2032 - regression_loss: 0.1954 - classification_loss: 0.0078 228/500 [============>.................] - ETA: 2:18 - loss: 0.2033 - regression_loss: 0.1955 - classification_loss: 0.0079 229/500 [============>.................] - ETA: 2:18 - loss: 0.2031 - regression_loss: 0.1953 - classification_loss: 0.0078 230/500 [============>.................] - ETA: 2:17 - loss: 0.2030 - regression_loss: 0.1952 - classification_loss: 0.0078 231/500 [============>.................] - ETA: 2:17 - loss: 0.2029 - regression_loss: 0.1951 - classification_loss: 0.0078 232/500 [============>.................] - ETA: 2:16 - loss: 0.2027 - regression_loss: 0.1948 - classification_loss: 0.0078 233/500 [============>.................] 
- ETA: 2:16 - loss: 0.2034 - regression_loss: 0.1953 - classification_loss: 0.0081 234/500 [=============>................] - ETA: 2:15 - loss: 0.2029 - regression_loss: 0.1949 - classification_loss: 0.0080 235/500 [=============>................] - ETA: 2:15 - loss: 0.2023 - regression_loss: 0.1943 - classification_loss: 0.0080 236/500 [=============>................] - ETA: 2:14 - loss: 0.2022 - regression_loss: 0.1942 - classification_loss: 0.0080 237/500 [=============>................] - ETA: 2:14 - loss: 0.2018 - regression_loss: 0.1938 - classification_loss: 0.0080 238/500 [=============>................] - ETA: 2:13 - loss: 0.2022 - regression_loss: 0.1942 - classification_loss: 0.0080 239/500 [=============>................] - ETA: 2:13 - loss: 0.2020 - regression_loss: 0.1941 - classification_loss: 0.0080 240/500 [=============>................] - ETA: 2:12 - loss: 0.2016 - regression_loss: 0.1937 - classification_loss: 0.0080 241/500 [=============>................] - ETA: 2:12 - loss: 0.2013 - regression_loss: 0.1933 - classification_loss: 0.0079 242/500 [=============>................] - ETA: 2:11 - loss: 0.2012 - regression_loss: 0.1933 - classification_loss: 0.0079 243/500 [=============>................] - ETA: 2:11 - loss: 0.2008 - regression_loss: 0.1929 - classification_loss: 0.0079 244/500 [=============>................] - ETA: 2:10 - loss: 0.2012 - regression_loss: 0.1933 - classification_loss: 0.0079 245/500 [=============>................] - ETA: 2:10 - loss: 0.2013 - regression_loss: 0.1934 - classification_loss: 0.0079 246/500 [=============>................] - ETA: 2:09 - loss: 0.2014 - regression_loss: 0.1936 - classification_loss: 0.0079 247/500 [=============>................] - ETA: 2:09 - loss: 0.2015 - regression_loss: 0.1937 - classification_loss: 0.0079 248/500 [=============>................] - ETA: 2:08 - loss: 0.2015 - regression_loss: 0.1937 - classification_loss: 0.0079 249/500 [=============>................] 
- ETA: 2:08 - loss: 0.2016 - regression_loss: 0.1937 - classification_loss: 0.0079 250/500 [==============>...............] - ETA: 2:07 - loss: 0.2013 - regression_loss: 0.1934 - classification_loss: 0.0079 251/500 [==============>...............] - ETA: 2:07 - loss: 0.2008 - regression_loss: 0.1930 - classification_loss: 0.0078 252/500 [==============>...............] - ETA: 2:06 - loss: 0.2010 - regression_loss: 0.1932 - classification_loss: 0.0078 253/500 [==============>...............] - ETA: 2:05 - loss: 0.2013 - regression_loss: 0.1934 - classification_loss: 0.0078 254/500 [==============>...............] - ETA: 2:05 - loss: 0.2010 - regression_loss: 0.1932 - classification_loss: 0.0078 255/500 [==============>...............] - ETA: 2:04 - loss: 0.2009 - regression_loss: 0.1931 - classification_loss: 0.0078 256/500 [==============>...............] - ETA: 2:04 - loss: 0.2022 - regression_loss: 0.1941 - classification_loss: 0.0080 257/500 [==============>...............] - ETA: 2:03 - loss: 0.2022 - regression_loss: 0.1941 - classification_loss: 0.0080 258/500 [==============>...............] - ETA: 2:03 - loss: 0.2023 - regression_loss: 0.1943 - classification_loss: 0.0080 259/500 [==============>...............] - ETA: 2:02 - loss: 0.2033 - regression_loss: 0.1952 - classification_loss: 0.0081 260/500 [==============>...............] - ETA: 2:02 - loss: 0.2035 - regression_loss: 0.1955 - classification_loss: 0.0081 261/500 [==============>...............] - ETA: 2:01 - loss: 0.2036 - regression_loss: 0.1955 - classification_loss: 0.0081 262/500 [==============>...............] - ETA: 2:01 - loss: 0.2034 - regression_loss: 0.1954 - classification_loss: 0.0080 263/500 [==============>...............] - ETA: 2:00 - loss: 0.2031 - regression_loss: 0.1951 - classification_loss: 0.0080 264/500 [==============>...............] - ETA: 2:00 - loss: 0.2037 - regression_loss: 0.1956 - classification_loss: 0.0080 265/500 [==============>...............] 
- ETA: 1:59 - loss: 0.2036 - regression_loss: 0.1956 - classification_loss: 0.0080 266/500 [==============>...............] - ETA: 1:59 - loss: 0.2032 - regression_loss: 0.1952 - classification_loss: 0.0080 267/500 [===============>..............] - ETA: 1:58 - loss: 0.2032 - regression_loss: 0.1952 - classification_loss: 0.0080 268/500 [===============>..............] - ETA: 1:58 - loss: 0.2032 - regression_loss: 0.1953 - classification_loss: 0.0080 269/500 [===============>..............] - ETA: 1:57 - loss: 0.2031 - regression_loss: 0.1951 - classification_loss: 0.0080 270/500 [===============>..............] - ETA: 1:57 - loss: 0.2036 - regression_loss: 0.1956 - classification_loss: 0.0080 271/500 [===============>..............] - ETA: 1:56 - loss: 0.2044 - regression_loss: 0.1964 - classification_loss: 0.0080 272/500 [===============>..............] - ETA: 1:56 - loss: 0.2051 - regression_loss: 0.1971 - classification_loss: 0.0080 273/500 [===============>..............] - ETA: 1:55 - loss: 0.2049 - regression_loss: 0.1969 - classification_loss: 0.0080 274/500 [===============>..............] - ETA: 1:55 - loss: 0.2050 - regression_loss: 0.1970 - classification_loss: 0.0080 275/500 [===============>..............] - ETA: 1:54 - loss: 0.2050 - regression_loss: 0.1970 - classification_loss: 0.0080 276/500 [===============>..............] - ETA: 1:54 - loss: 0.2049 - regression_loss: 0.1969 - classification_loss: 0.0080 277/500 [===============>..............] - ETA: 1:53 - loss: 0.2059 - regression_loss: 0.1979 - classification_loss: 0.0080 278/500 [===============>..............] - ETA: 1:53 - loss: 0.2058 - regression_loss: 0.1978 - classification_loss: 0.0080 279/500 [===============>..............] - ETA: 1:52 - loss: 0.2057 - regression_loss: 0.1977 - classification_loss: 0.0080 280/500 [===============>..............] - ETA: 1:52 - loss: 0.2056 - regression_loss: 0.1975 - classification_loss: 0.0080 281/500 [===============>..............] 
- ETA: 1:51 - loss: 0.2056 - regression_loss: 0.1976 - classification_loss: 0.0080 282/500 [===============>..............] - ETA: 1:51 - loss: 0.2051 - regression_loss: 0.1971 - classification_loss: 0.0080 283/500 [===============>..............] - ETA: 1:50 - loss: 0.2045 - regression_loss: 0.1966 - classification_loss: 0.0080 284/500 [================>.............] - ETA: 1:50 - loss: 0.2048 - regression_loss: 0.1969 - classification_loss: 0.0080 285/500 [================>.............] - ETA: 1:49 - loss: 0.2042 - regression_loss: 0.1963 - classification_loss: 0.0079 286/500 [================>.............] - ETA: 1:49 - loss: 0.2045 - regression_loss: 0.1965 - classification_loss: 0.0080 287/500 [================>.............] - ETA: 1:48 - loss: 0.2049 - regression_loss: 0.1970 - classification_loss: 0.0080 288/500 [================>.............] - ETA: 1:48 - loss: 0.2050 - regression_loss: 0.1970 - classification_loss: 0.0080 289/500 [================>.............] - ETA: 1:47 - loss: 0.2047 - regression_loss: 0.1968 - classification_loss: 0.0079 290/500 [================>.............] - ETA: 1:47 - loss: 0.2049 - regression_loss: 0.1969 - classification_loss: 0.0080 291/500 [================>.............] - ETA: 1:46 - loss: 0.2048 - regression_loss: 0.1969 - classification_loss: 0.0079 292/500 [================>.............] - ETA: 1:46 - loss: 0.2046 - regression_loss: 0.1966 - classification_loss: 0.0079 293/500 [================>.............] - ETA: 1:45 - loss: 0.2044 - regression_loss: 0.1965 - classification_loss: 0.0079 294/500 [================>.............] - ETA: 1:45 - loss: 0.2045 - regression_loss: 0.1966 - classification_loss: 0.0079 295/500 [================>.............] - ETA: 1:44 - loss: 0.2045 - regression_loss: 0.1966 - classification_loss: 0.0079 296/500 [================>.............] - ETA: 1:43 - loss: 0.2045 - regression_loss: 0.1966 - classification_loss: 0.0079 297/500 [================>.............] 
[... per-step progress updates for epoch 41, steps 298-499 elided: loss declined steadily from 0.2041 to 0.1949 (regression_loss 0.1962 -> 0.1877, classification_loss 0.0079 -> 0.0073) ...]
500/500 [==============================] - 255s 510ms/step - loss: 0.1950 - regression_loss: 0.1877 - classification_loss: 0.0073
1172 instances of class plum with average precision: 0.7513
mAP: 0.7513
Epoch 00041: saving model to ./training/snapshots/resnet101_pascal_41.h5
Epoch 42/150
[... per-step progress updates for epoch 42, steps 1-132 elided: loss fluctuating between roughly 0.177 and 0.215 (ending near loss 0.1805, regression_loss 0.1735, classification_loss 0.0070), ETA decreasing from 4:09 to 3:08 ...]
- ETA: 3:07 - loss: 0.1802 - regression_loss: 0.1732 - classification_loss: 0.0069 133/500 [======>.......................] - ETA: 3:07 - loss: 0.1797 - regression_loss: 0.1727 - classification_loss: 0.0070 134/500 [=======>......................] - ETA: 3:06 - loss: 0.1790 - regression_loss: 0.1720 - classification_loss: 0.0069 135/500 [=======>......................] - ETA: 3:06 - loss: 0.1789 - regression_loss: 0.1720 - classification_loss: 0.0070 136/500 [=======>......................] - ETA: 3:05 - loss: 0.1784 - regression_loss: 0.1714 - classification_loss: 0.0069 137/500 [=======>......................] - ETA: 3:05 - loss: 0.1792 - regression_loss: 0.1722 - classification_loss: 0.0070 138/500 [=======>......................] - ETA: 3:04 - loss: 0.1789 - regression_loss: 0.1719 - classification_loss: 0.0070 139/500 [=======>......................] - ETA: 3:04 - loss: 0.1803 - regression_loss: 0.1733 - classification_loss: 0.0070 140/500 [=======>......................] - ETA: 3:03 - loss: 0.1800 - regression_loss: 0.1730 - classification_loss: 0.0070 141/500 [=======>......................] - ETA: 3:03 - loss: 0.1799 - regression_loss: 0.1729 - classification_loss: 0.0070 142/500 [=======>......................] - ETA: 3:02 - loss: 0.1796 - regression_loss: 0.1726 - classification_loss: 0.0070 143/500 [=======>......................] - ETA: 3:02 - loss: 0.1790 - regression_loss: 0.1720 - classification_loss: 0.0070 144/500 [=======>......................] - ETA: 3:01 - loss: 0.1794 - regression_loss: 0.1724 - classification_loss: 0.0070 145/500 [=======>......................] - ETA: 3:01 - loss: 0.1793 - regression_loss: 0.1724 - classification_loss: 0.0070 146/500 [=======>......................] - ETA: 3:00 - loss: 0.1788 - regression_loss: 0.1719 - classification_loss: 0.0069 147/500 [=======>......................] - ETA: 2:59 - loss: 0.1789 - regression_loss: 0.1720 - classification_loss: 0.0069 148/500 [=======>......................] 
- ETA: 2:59 - loss: 0.1800 - regression_loss: 0.1732 - classification_loss: 0.0069 149/500 [=======>......................] - ETA: 2:58 - loss: 0.1801 - regression_loss: 0.1732 - classification_loss: 0.0069 150/500 [========>.....................] - ETA: 2:58 - loss: 0.1805 - regression_loss: 0.1736 - classification_loss: 0.0069 151/500 [========>.....................] - ETA: 2:57 - loss: 0.1800 - regression_loss: 0.1732 - classification_loss: 0.0068 152/500 [========>.....................] - ETA: 2:57 - loss: 0.1805 - regression_loss: 0.1737 - classification_loss: 0.0068 153/500 [========>.....................] - ETA: 2:56 - loss: 0.1799 - regression_loss: 0.1731 - classification_loss: 0.0068 154/500 [========>.....................] - ETA: 2:56 - loss: 0.1799 - regression_loss: 0.1731 - classification_loss: 0.0068 155/500 [========>.....................] - ETA: 2:55 - loss: 0.1807 - regression_loss: 0.1738 - classification_loss: 0.0068 156/500 [========>.....................] - ETA: 2:55 - loss: 0.1802 - regression_loss: 0.1733 - classification_loss: 0.0068 157/500 [========>.....................] - ETA: 2:54 - loss: 0.1797 - regression_loss: 0.1729 - classification_loss: 0.0068 158/500 [========>.....................] - ETA: 2:54 - loss: 0.1799 - regression_loss: 0.1731 - classification_loss: 0.0068 159/500 [========>.....................] - ETA: 2:53 - loss: 0.1797 - regression_loss: 0.1730 - classification_loss: 0.0067 160/500 [========>.....................] - ETA: 2:53 - loss: 0.1801 - regression_loss: 0.1733 - classification_loss: 0.0068 161/500 [========>.....................] - ETA: 2:52 - loss: 0.1801 - regression_loss: 0.1733 - classification_loss: 0.0068 162/500 [========>.....................] - ETA: 2:52 - loss: 0.1798 - regression_loss: 0.1731 - classification_loss: 0.0068 163/500 [========>.....................] - ETA: 2:51 - loss: 0.1793 - regression_loss: 0.1726 - classification_loss: 0.0067 164/500 [========>.....................] 
- ETA: 2:51 - loss: 0.1787 - regression_loss: 0.1720 - classification_loss: 0.0067 165/500 [========>.....................] - ETA: 2:50 - loss: 0.1788 - regression_loss: 0.1721 - classification_loss: 0.0067 166/500 [========>.....................] - ETA: 2:50 - loss: 0.1785 - regression_loss: 0.1719 - classification_loss: 0.0067 167/500 [=========>....................] - ETA: 2:49 - loss: 0.1782 - regression_loss: 0.1715 - classification_loss: 0.0067 168/500 [=========>....................] - ETA: 2:49 - loss: 0.1777 - regression_loss: 0.1710 - classification_loss: 0.0067 169/500 [=========>....................] - ETA: 2:48 - loss: 0.1782 - regression_loss: 0.1715 - classification_loss: 0.0067 170/500 [=========>....................] - ETA: 2:47 - loss: 0.1786 - regression_loss: 0.1718 - classification_loss: 0.0068 171/500 [=========>....................] - ETA: 2:47 - loss: 0.1783 - regression_loss: 0.1715 - classification_loss: 0.0068 172/500 [=========>....................] - ETA: 2:46 - loss: 0.1778 - regression_loss: 0.1710 - classification_loss: 0.0068 173/500 [=========>....................] - ETA: 2:46 - loss: 0.1778 - regression_loss: 0.1710 - classification_loss: 0.0068 174/500 [=========>....................] - ETA: 2:45 - loss: 0.1778 - regression_loss: 0.1711 - classification_loss: 0.0068 175/500 [=========>....................] - ETA: 2:45 - loss: 0.1780 - regression_loss: 0.1712 - classification_loss: 0.0068 176/500 [=========>....................] - ETA: 2:44 - loss: 0.1785 - regression_loss: 0.1716 - classification_loss: 0.0069 177/500 [=========>....................] - ETA: 2:44 - loss: 0.1793 - regression_loss: 0.1723 - classification_loss: 0.0070 178/500 [=========>....................] - ETA: 2:43 - loss: 0.1792 - regression_loss: 0.1723 - classification_loss: 0.0070 179/500 [=========>....................] - ETA: 2:43 - loss: 0.1792 - regression_loss: 0.1722 - classification_loss: 0.0070 180/500 [=========>....................] 
- ETA: 2:42 - loss: 0.1801 - regression_loss: 0.1731 - classification_loss: 0.0070 181/500 [=========>....................] - ETA: 2:42 - loss: 0.1805 - regression_loss: 0.1734 - classification_loss: 0.0071 182/500 [=========>....................] - ETA: 2:41 - loss: 0.1805 - regression_loss: 0.1734 - classification_loss: 0.0071 183/500 [=========>....................] - ETA: 2:41 - loss: 0.1806 - regression_loss: 0.1735 - classification_loss: 0.0071 184/500 [==========>...................] - ETA: 2:40 - loss: 0.1808 - regression_loss: 0.1738 - classification_loss: 0.0071 185/500 [==========>...................] - ETA: 2:40 - loss: 0.1816 - regression_loss: 0.1745 - classification_loss: 0.0071 186/500 [==========>...................] - ETA: 2:39 - loss: 0.1814 - regression_loss: 0.1744 - classification_loss: 0.0070 187/500 [==========>...................] - ETA: 2:39 - loss: 0.1813 - regression_loss: 0.1743 - classification_loss: 0.0070 188/500 [==========>...................] - ETA: 2:38 - loss: 0.1809 - regression_loss: 0.1739 - classification_loss: 0.0070 189/500 [==========>...................] - ETA: 2:38 - loss: 0.1805 - regression_loss: 0.1735 - classification_loss: 0.0070 190/500 [==========>...................] - ETA: 2:37 - loss: 0.1804 - regression_loss: 0.1735 - classification_loss: 0.0069 191/500 [==========>...................] - ETA: 2:37 - loss: 0.1799 - regression_loss: 0.1730 - classification_loss: 0.0069 192/500 [==========>...................] - ETA: 2:36 - loss: 0.1799 - regression_loss: 0.1730 - classification_loss: 0.0069 193/500 [==========>...................] - ETA: 2:36 - loss: 0.1799 - regression_loss: 0.1729 - classification_loss: 0.0069 194/500 [==========>...................] - ETA: 2:35 - loss: 0.1803 - regression_loss: 0.1734 - classification_loss: 0.0069 195/500 [==========>...................] - ETA: 2:35 - loss: 0.1803 - regression_loss: 0.1734 - classification_loss: 0.0069 196/500 [==========>...................] 
- ETA: 2:34 - loss: 0.1803 - regression_loss: 0.1733 - classification_loss: 0.0069 197/500 [==========>...................] - ETA: 2:34 - loss: 0.1801 - regression_loss: 0.1732 - classification_loss: 0.0069 198/500 [==========>...................] - ETA: 2:33 - loss: 0.1803 - regression_loss: 0.1733 - classification_loss: 0.0069 199/500 [==========>...................] - ETA: 2:33 - loss: 0.1806 - regression_loss: 0.1737 - classification_loss: 0.0069 200/500 [===========>..................] - ETA: 2:32 - loss: 0.1808 - regression_loss: 0.1739 - classification_loss: 0.0069 201/500 [===========>..................] - ETA: 2:32 - loss: 0.1811 - regression_loss: 0.1741 - classification_loss: 0.0070 202/500 [===========>..................] - ETA: 2:31 - loss: 0.1810 - regression_loss: 0.1740 - classification_loss: 0.0070 203/500 [===========>..................] - ETA: 2:31 - loss: 0.1810 - regression_loss: 0.1740 - classification_loss: 0.0070 204/500 [===========>..................] - ETA: 2:30 - loss: 0.1810 - regression_loss: 0.1741 - classification_loss: 0.0070 205/500 [===========>..................] - ETA: 2:30 - loss: 0.1815 - regression_loss: 0.1745 - classification_loss: 0.0070 206/500 [===========>..................] - ETA: 2:29 - loss: 0.1816 - regression_loss: 0.1746 - classification_loss: 0.0070 207/500 [===========>..................] - ETA: 2:28 - loss: 0.1814 - regression_loss: 0.1745 - classification_loss: 0.0069 208/500 [===========>..................] - ETA: 2:28 - loss: 0.1825 - regression_loss: 0.1753 - classification_loss: 0.0072 209/500 [===========>..................] - ETA: 2:27 - loss: 0.1821 - regression_loss: 0.1749 - classification_loss: 0.0072 210/500 [===========>..................] - ETA: 2:27 - loss: 0.1823 - regression_loss: 0.1751 - classification_loss: 0.0072 211/500 [===========>..................] - ETA: 2:26 - loss: 0.1821 - regression_loss: 0.1749 - classification_loss: 0.0072 212/500 [===========>..................] 
- ETA: 2:26 - loss: 0.1819 - regression_loss: 0.1747 - classification_loss: 0.0072 213/500 [===========>..................] - ETA: 2:25 - loss: 0.1811 - regression_loss: 0.1740 - classification_loss: 0.0071 214/500 [===========>..................] - ETA: 2:25 - loss: 0.1819 - regression_loss: 0.1747 - classification_loss: 0.0072 215/500 [===========>..................] - ETA: 2:24 - loss: 0.1818 - regression_loss: 0.1746 - classification_loss: 0.0072 216/500 [===========>..................] - ETA: 2:24 - loss: 0.1821 - regression_loss: 0.1749 - classification_loss: 0.0072 217/500 [============>.................] - ETA: 2:23 - loss: 0.1820 - regression_loss: 0.1749 - classification_loss: 0.0072 218/500 [============>.................] - ETA: 2:23 - loss: 0.1818 - regression_loss: 0.1747 - classification_loss: 0.0071 219/500 [============>.................] - ETA: 2:22 - loss: 0.1816 - regression_loss: 0.1745 - classification_loss: 0.0071 220/500 [============>.................] - ETA: 2:22 - loss: 0.1825 - regression_loss: 0.1754 - classification_loss: 0.0071 221/500 [============>.................] - ETA: 2:21 - loss: 0.1825 - regression_loss: 0.1754 - classification_loss: 0.0071 222/500 [============>.................] - ETA: 2:21 - loss: 0.1824 - regression_loss: 0.1753 - classification_loss: 0.0071 223/500 [============>.................] - ETA: 2:20 - loss: 0.1820 - regression_loss: 0.1749 - classification_loss: 0.0071 224/500 [============>.................] - ETA: 2:20 - loss: 0.1817 - regression_loss: 0.1746 - classification_loss: 0.0071 225/500 [============>.................] - ETA: 2:19 - loss: 0.1821 - regression_loss: 0.1749 - classification_loss: 0.0071 226/500 [============>.................] - ETA: 2:19 - loss: 0.1816 - regression_loss: 0.1745 - classification_loss: 0.0071 227/500 [============>.................] - ETA: 2:18 - loss: 0.1817 - regression_loss: 0.1746 - classification_loss: 0.0071 228/500 [============>.................] 
- ETA: 2:18 - loss: 0.1813 - regression_loss: 0.1743 - classification_loss: 0.0071 229/500 [============>.................] - ETA: 2:17 - loss: 0.1819 - regression_loss: 0.1748 - classification_loss: 0.0071 230/500 [============>.................] - ETA: 2:17 - loss: 0.1815 - regression_loss: 0.1744 - classification_loss: 0.0071 231/500 [============>.................] - ETA: 2:16 - loss: 0.1813 - regression_loss: 0.1742 - classification_loss: 0.0071 232/500 [============>.................] - ETA: 2:16 - loss: 0.1811 - regression_loss: 0.1740 - classification_loss: 0.0071 233/500 [============>.................] - ETA: 2:15 - loss: 0.1811 - regression_loss: 0.1740 - classification_loss: 0.0071 234/500 [=============>................] - ETA: 2:15 - loss: 0.1810 - regression_loss: 0.1739 - classification_loss: 0.0071 235/500 [=============>................] - ETA: 2:14 - loss: 0.1805 - regression_loss: 0.1734 - classification_loss: 0.0070 236/500 [=============>................] - ETA: 2:14 - loss: 0.1805 - regression_loss: 0.1734 - classification_loss: 0.0071 237/500 [=============>................] - ETA: 2:13 - loss: 0.1806 - regression_loss: 0.1736 - classification_loss: 0.0070 238/500 [=============>................] - ETA: 2:13 - loss: 0.1801 - regression_loss: 0.1731 - classification_loss: 0.0070 239/500 [=============>................] - ETA: 2:12 - loss: 0.1802 - regression_loss: 0.1732 - classification_loss: 0.0070 240/500 [=============>................] - ETA: 2:12 - loss: 0.1802 - regression_loss: 0.1732 - classification_loss: 0.0070 241/500 [=============>................] - ETA: 2:11 - loss: 0.1802 - regression_loss: 0.1732 - classification_loss: 0.0070 242/500 [=============>................] - ETA: 2:11 - loss: 0.1801 - regression_loss: 0.1732 - classification_loss: 0.0070 243/500 [=============>................] - ETA: 2:10 - loss: 0.1803 - regression_loss: 0.1733 - classification_loss: 0.0070 244/500 [=============>................] 
- ETA: 2:10 - loss: 0.1801 - regression_loss: 0.1732 - classification_loss: 0.0070 245/500 [=============>................] - ETA: 2:09 - loss: 0.1800 - regression_loss: 0.1730 - classification_loss: 0.0070 246/500 [=============>................] - ETA: 2:09 - loss: 0.1799 - regression_loss: 0.1729 - classification_loss: 0.0070 247/500 [=============>................] - ETA: 2:08 - loss: 0.1802 - regression_loss: 0.1732 - classification_loss: 0.0070 248/500 [=============>................] - ETA: 2:08 - loss: 0.1806 - regression_loss: 0.1736 - classification_loss: 0.0070 249/500 [=============>................] - ETA: 2:07 - loss: 0.1811 - regression_loss: 0.1740 - classification_loss: 0.0070 250/500 [==============>...............] - ETA: 2:07 - loss: 0.1810 - regression_loss: 0.1740 - classification_loss: 0.0070 251/500 [==============>...............] - ETA: 2:06 - loss: 0.1808 - regression_loss: 0.1738 - classification_loss: 0.0070 252/500 [==============>...............] - ETA: 2:06 - loss: 0.1806 - regression_loss: 0.1736 - classification_loss: 0.0070 253/500 [==============>...............] - ETA: 2:05 - loss: 0.1803 - regression_loss: 0.1734 - classification_loss: 0.0070 254/500 [==============>...............] - ETA: 2:05 - loss: 0.1806 - regression_loss: 0.1736 - classification_loss: 0.0070 255/500 [==============>...............] - ETA: 2:04 - loss: 0.1805 - regression_loss: 0.1736 - classification_loss: 0.0070 256/500 [==============>...............] - ETA: 2:03 - loss: 0.1806 - regression_loss: 0.1736 - classification_loss: 0.0070 257/500 [==============>...............] - ETA: 2:03 - loss: 0.1805 - regression_loss: 0.1735 - classification_loss: 0.0070 258/500 [==============>...............] - ETA: 2:02 - loss: 0.1810 - regression_loss: 0.1740 - classification_loss: 0.0070 259/500 [==============>...............] - ETA: 2:02 - loss: 0.1809 - regression_loss: 0.1739 - classification_loss: 0.0070 260/500 [==============>...............] 
- ETA: 2:01 - loss: 0.1809 - regression_loss: 0.1739 - classification_loss: 0.0070 261/500 [==============>...............] - ETA: 2:01 - loss: 0.1810 - regression_loss: 0.1739 - classification_loss: 0.0070 262/500 [==============>...............] - ETA: 2:00 - loss: 0.1812 - regression_loss: 0.1741 - classification_loss: 0.0071 263/500 [==============>...............] - ETA: 2:00 - loss: 0.1809 - regression_loss: 0.1738 - classification_loss: 0.0071 264/500 [==============>...............] - ETA: 1:59 - loss: 0.1806 - regression_loss: 0.1736 - classification_loss: 0.0071 265/500 [==============>...............] - ETA: 1:59 - loss: 0.1807 - regression_loss: 0.1736 - classification_loss: 0.0070 266/500 [==============>...............] - ETA: 1:58 - loss: 0.1805 - regression_loss: 0.1735 - classification_loss: 0.0070 267/500 [===============>..............] - ETA: 1:58 - loss: 0.1801 - regression_loss: 0.1731 - classification_loss: 0.0070 268/500 [===============>..............] - ETA: 1:57 - loss: 0.1802 - regression_loss: 0.1731 - classification_loss: 0.0070 269/500 [===============>..............] - ETA: 1:57 - loss: 0.1799 - regression_loss: 0.1729 - classification_loss: 0.0070 270/500 [===============>..............] - ETA: 1:56 - loss: 0.1806 - regression_loss: 0.1736 - classification_loss: 0.0070 271/500 [===============>..............] - ETA: 1:56 - loss: 0.1811 - regression_loss: 0.1741 - classification_loss: 0.0071 272/500 [===============>..............] - ETA: 1:55 - loss: 0.1811 - regression_loss: 0.1741 - classification_loss: 0.0070 273/500 [===============>..............] - ETA: 1:55 - loss: 0.1811 - regression_loss: 0.1740 - classification_loss: 0.0070 274/500 [===============>..............] - ETA: 1:54 - loss: 0.1807 - regression_loss: 0.1737 - classification_loss: 0.0070 275/500 [===============>..............] - ETA: 1:54 - loss: 0.1806 - regression_loss: 0.1736 - classification_loss: 0.0070 276/500 [===============>..............] 
- ETA: 1:53 - loss: 0.1804 - regression_loss: 0.1734 - classification_loss: 0.0070 277/500 [===============>..............] - ETA: 1:53 - loss: 0.1808 - regression_loss: 0.1738 - classification_loss: 0.0070 278/500 [===============>..............] - ETA: 1:52 - loss: 0.1806 - regression_loss: 0.1736 - classification_loss: 0.0070 279/500 [===============>..............] - ETA: 1:52 - loss: 0.1808 - regression_loss: 0.1738 - classification_loss: 0.0070 280/500 [===============>..............] - ETA: 1:51 - loss: 0.1811 - regression_loss: 0.1741 - classification_loss: 0.0070 281/500 [===============>..............] - ETA: 1:51 - loss: 0.1817 - regression_loss: 0.1747 - classification_loss: 0.0070 282/500 [===============>..............] - ETA: 1:50 - loss: 0.1818 - regression_loss: 0.1749 - classification_loss: 0.0070 283/500 [===============>..............] - ETA: 1:50 - loss: 0.1818 - regression_loss: 0.1749 - classification_loss: 0.0070 284/500 [================>.............] - ETA: 1:49 - loss: 0.1819 - regression_loss: 0.1749 - classification_loss: 0.0070 285/500 [================>.............] - ETA: 1:49 - loss: 0.1823 - regression_loss: 0.1753 - classification_loss: 0.0070 286/500 [================>.............] - ETA: 1:48 - loss: 0.1820 - regression_loss: 0.1750 - classification_loss: 0.0070 287/500 [================>.............] - ETA: 1:48 - loss: 0.1822 - regression_loss: 0.1752 - classification_loss: 0.0070 288/500 [================>.............] - ETA: 1:47 - loss: 0.1824 - regression_loss: 0.1754 - classification_loss: 0.0070 289/500 [================>.............] - ETA: 1:47 - loss: 0.1820 - regression_loss: 0.1751 - classification_loss: 0.0070 290/500 [================>.............] - ETA: 1:46 - loss: 0.1818 - regression_loss: 0.1749 - classification_loss: 0.0069 291/500 [================>.............] - ETA: 1:46 - loss: 0.1817 - regression_loss: 0.1748 - classification_loss: 0.0069 292/500 [================>.............] 
- ETA: 1:45 - loss: 0.1818 - regression_loss: 0.1749 - classification_loss: 0.0069 293/500 [================>.............] - ETA: 1:45 - loss: 0.1818 - regression_loss: 0.1749 - classification_loss: 0.0070 294/500 [================>.............] - ETA: 1:44 - loss: 0.1826 - regression_loss: 0.1754 - classification_loss: 0.0072 295/500 [================>.............] - ETA: 1:44 - loss: 0.1824 - regression_loss: 0.1752 - classification_loss: 0.0072 296/500 [================>.............] - ETA: 1:43 - loss: 0.1825 - regression_loss: 0.1753 - classification_loss: 0.0072 297/500 [================>.............] - ETA: 1:43 - loss: 0.1826 - regression_loss: 0.1754 - classification_loss: 0.0072 298/500 [================>.............] - ETA: 1:42 - loss: 0.1828 - regression_loss: 0.1755 - classification_loss: 0.0072 299/500 [================>.............] - ETA: 1:42 - loss: 0.1829 - regression_loss: 0.1756 - classification_loss: 0.0073 300/500 [=================>............] - ETA: 1:41 - loss: 0.1830 - regression_loss: 0.1757 - classification_loss: 0.0073 301/500 [=================>............] - ETA: 1:41 - loss: 0.1828 - regression_loss: 0.1756 - classification_loss: 0.0073 302/500 [=================>............] - ETA: 1:40 - loss: 0.1832 - regression_loss: 0.1760 - classification_loss: 0.0072 303/500 [=================>............] - ETA: 1:40 - loss: 0.1831 - regression_loss: 0.1759 - classification_loss: 0.0072 304/500 [=================>............] - ETA: 1:39 - loss: 0.1833 - regression_loss: 0.1760 - classification_loss: 0.0072 305/500 [=================>............] - ETA: 1:39 - loss: 0.1835 - regression_loss: 0.1762 - classification_loss: 0.0073 306/500 [=================>............] - ETA: 1:38 - loss: 0.1832 - regression_loss: 0.1759 - classification_loss: 0.0073 307/500 [=================>............] - ETA: 1:38 - loss: 0.1831 - regression_loss: 0.1758 - classification_loss: 0.0073 308/500 [=================>............] 
- ETA: 1:37 - loss: 0.1830 - regression_loss: 0.1757 - classification_loss: 0.0073 309/500 [=================>............] - ETA: 1:37 - loss: 0.1836 - regression_loss: 0.1763 - classification_loss: 0.0073 310/500 [=================>............] - ETA: 1:36 - loss: 0.1834 - regression_loss: 0.1761 - classification_loss: 0.0073 311/500 [=================>............] - ETA: 1:36 - loss: 0.1834 - regression_loss: 0.1761 - classification_loss: 0.0073 312/500 [=================>............] - ETA: 1:35 - loss: 0.1836 - regression_loss: 0.1762 - classification_loss: 0.0073 313/500 [=================>............] - ETA: 1:35 - loss: 0.1839 - regression_loss: 0.1766 - classification_loss: 0.0074 314/500 [=================>............] - ETA: 1:34 - loss: 0.1839 - regression_loss: 0.1765 - classification_loss: 0.0074 315/500 [=================>............] - ETA: 1:33 - loss: 0.1843 - regression_loss: 0.1769 - classification_loss: 0.0074 316/500 [=================>............] - ETA: 1:33 - loss: 0.1844 - regression_loss: 0.1770 - classification_loss: 0.0074 317/500 [==================>...........] - ETA: 1:32 - loss: 0.1841 - regression_loss: 0.1767 - classification_loss: 0.0074 318/500 [==================>...........] - ETA: 1:32 - loss: 0.1839 - regression_loss: 0.1766 - classification_loss: 0.0074 319/500 [==================>...........] - ETA: 1:31 - loss: 0.1839 - regression_loss: 0.1766 - classification_loss: 0.0074 320/500 [==================>...........] - ETA: 1:31 - loss: 0.1841 - regression_loss: 0.1767 - classification_loss: 0.0074 321/500 [==================>...........] - ETA: 1:30 - loss: 0.1842 - regression_loss: 0.1769 - classification_loss: 0.0074 322/500 [==================>...........] - ETA: 1:30 - loss: 0.1846 - regression_loss: 0.1771 - classification_loss: 0.0074 323/500 [==================>...........] - ETA: 1:29 - loss: 0.1844 - regression_loss: 0.1770 - classification_loss: 0.0074 324/500 [==================>...........] 
- ETA: 1:29 - loss: 0.1844 - regression_loss: 0.1770 - classification_loss: 0.0074 325/500 [==================>...........] - ETA: 1:28 - loss: 0.1844 - regression_loss: 0.1771 - classification_loss: 0.0074 326/500 [==================>...........] - ETA: 1:28 - loss: 0.1842 - regression_loss: 0.1768 - classification_loss: 0.0074 327/500 [==================>...........] - ETA: 1:27 - loss: 0.1839 - regression_loss: 0.1766 - classification_loss: 0.0073 328/500 [==================>...........] - ETA: 1:27 - loss: 0.1840 - regression_loss: 0.1767 - classification_loss: 0.0073 329/500 [==================>...........] - ETA: 1:26 - loss: 0.1839 - regression_loss: 0.1766 - classification_loss: 0.0073 330/500 [==================>...........] - ETA: 1:26 - loss: 0.1842 - regression_loss: 0.1769 - classification_loss: 0.0074 331/500 [==================>...........] - ETA: 1:25 - loss: 0.1841 - regression_loss: 0.1768 - classification_loss: 0.0074 332/500 [==================>...........] - ETA: 1:25 - loss: 0.1838 - regression_loss: 0.1764 - classification_loss: 0.0073 333/500 [==================>...........] - ETA: 1:24 - loss: 0.1840 - regression_loss: 0.1767 - classification_loss: 0.0073 334/500 [===================>..........] - ETA: 1:24 - loss: 0.1839 - regression_loss: 0.1766 - classification_loss: 0.0073 335/500 [===================>..........] - ETA: 1:23 - loss: 0.1842 - regression_loss: 0.1768 - classification_loss: 0.0073 336/500 [===================>..........] - ETA: 1:23 - loss: 0.1845 - regression_loss: 0.1772 - classification_loss: 0.0074 337/500 [===================>..........] - ETA: 1:22 - loss: 0.1844 - regression_loss: 0.1770 - classification_loss: 0.0074 338/500 [===================>..........] - ETA: 1:22 - loss: 0.1843 - regression_loss: 0.1770 - classification_loss: 0.0074 339/500 [===================>..........] - ETA: 1:21 - loss: 0.1842 - regression_loss: 0.1768 - classification_loss: 0.0073 340/500 [===================>..........] 
500/500 [==============================] - 254s 508ms/step - loss: 0.1858 - regression_loss: 0.1788 - classification_loss: 0.0070
1172 instances of class plum with average precision: 0.7397
mAP: 0.7397
Epoch 00042: saving model to ./training/snapshots/resnet101_pascal_42.h5
Epoch 43/150
174/500 [=========>....................]
- ETA: 2:46 - loss: 0.1875 - regression_loss: 0.1807 - classification_loss: 0.0069 175/500 [=========>....................] - ETA: 2:45 - loss: 0.1869 - regression_loss: 0.1801 - classification_loss: 0.0068 176/500 [=========>....................] - ETA: 2:45 - loss: 0.1869 - regression_loss: 0.1800 - classification_loss: 0.0068 177/500 [=========>....................] - ETA: 2:44 - loss: 0.1871 - regression_loss: 0.1802 - classification_loss: 0.0068 178/500 [=========>....................] - ETA: 2:44 - loss: 0.1885 - regression_loss: 0.1817 - classification_loss: 0.0068 179/500 [=========>....................] - ETA: 2:43 - loss: 0.1885 - regression_loss: 0.1817 - classification_loss: 0.0068 180/500 [=========>....................] - ETA: 2:43 - loss: 0.1884 - regression_loss: 0.1816 - classification_loss: 0.0068 181/500 [=========>....................] - ETA: 2:42 - loss: 0.1889 - regression_loss: 0.1821 - classification_loss: 0.0069 182/500 [=========>....................] - ETA: 2:42 - loss: 0.1883 - regression_loss: 0.1815 - classification_loss: 0.0068 183/500 [=========>....................] - ETA: 2:41 - loss: 0.1882 - regression_loss: 0.1814 - classification_loss: 0.0068 184/500 [==========>...................] - ETA: 2:41 - loss: 0.1883 - regression_loss: 0.1815 - classification_loss: 0.0068 185/500 [==========>...................] - ETA: 2:40 - loss: 0.1881 - regression_loss: 0.1813 - classification_loss: 0.0068 186/500 [==========>...................] - ETA: 2:40 - loss: 0.1880 - regression_loss: 0.1812 - classification_loss: 0.0068 187/500 [==========>...................] - ETA: 2:39 - loss: 0.1872 - regression_loss: 0.1805 - classification_loss: 0.0068 188/500 [==========>...................] - ETA: 2:39 - loss: 0.1883 - regression_loss: 0.1816 - classification_loss: 0.0067 189/500 [==========>...................] - ETA: 2:38 - loss: 0.1881 - regression_loss: 0.1813 - classification_loss: 0.0067 190/500 [==========>...................] 
- ETA: 2:37 - loss: 0.1887 - regression_loss: 0.1819 - classification_loss: 0.0068 191/500 [==========>...................] - ETA: 2:37 - loss: 0.1893 - regression_loss: 0.1826 - classification_loss: 0.0067 192/500 [==========>...................] - ETA: 2:36 - loss: 0.1899 - regression_loss: 0.1831 - classification_loss: 0.0068 193/500 [==========>...................] - ETA: 2:36 - loss: 0.1898 - regression_loss: 0.1830 - classification_loss: 0.0068 194/500 [==========>...................] - ETA: 2:35 - loss: 0.1897 - regression_loss: 0.1829 - classification_loss: 0.0067 195/500 [==========>...................] - ETA: 2:35 - loss: 0.1897 - regression_loss: 0.1830 - classification_loss: 0.0067 196/500 [==========>...................] - ETA: 2:34 - loss: 0.1893 - regression_loss: 0.1826 - classification_loss: 0.0067 197/500 [==========>...................] - ETA: 2:34 - loss: 0.1896 - regression_loss: 0.1829 - classification_loss: 0.0067 198/500 [==========>...................] - ETA: 2:33 - loss: 0.1892 - regression_loss: 0.1826 - classification_loss: 0.0067 199/500 [==========>...................] - ETA: 2:33 - loss: 0.1886 - regression_loss: 0.1820 - classification_loss: 0.0066 200/500 [===========>..................] - ETA: 2:32 - loss: 0.1882 - regression_loss: 0.1816 - classification_loss: 0.0066 201/500 [===========>..................] - ETA: 2:32 - loss: 0.1880 - regression_loss: 0.1814 - classification_loss: 0.0066 202/500 [===========>..................] - ETA: 2:31 - loss: 0.1882 - regression_loss: 0.1816 - classification_loss: 0.0066 203/500 [===========>..................] - ETA: 2:31 - loss: 0.1882 - regression_loss: 0.1816 - classification_loss: 0.0066 204/500 [===========>..................] - ETA: 2:30 - loss: 0.1891 - regression_loss: 0.1822 - classification_loss: 0.0069 205/500 [===========>..................] - ETA: 2:30 - loss: 0.1888 - regression_loss: 0.1819 - classification_loss: 0.0069 206/500 [===========>..................] 
- ETA: 2:29 - loss: 0.1891 - regression_loss: 0.1823 - classification_loss: 0.0069 207/500 [===========>..................] - ETA: 2:29 - loss: 0.1892 - regression_loss: 0.1823 - classification_loss: 0.0069 208/500 [===========>..................] - ETA: 2:28 - loss: 0.1892 - regression_loss: 0.1823 - classification_loss: 0.0069 209/500 [===========>..................] - ETA: 2:28 - loss: 0.1900 - regression_loss: 0.1832 - classification_loss: 0.0069 210/500 [===========>..................] - ETA: 2:27 - loss: 0.1894 - regression_loss: 0.1826 - classification_loss: 0.0068 211/500 [===========>..................] - ETA: 2:27 - loss: 0.1894 - regression_loss: 0.1826 - classification_loss: 0.0069 212/500 [===========>..................] - ETA: 2:26 - loss: 0.1895 - regression_loss: 0.1826 - classification_loss: 0.0068 213/500 [===========>..................] - ETA: 2:26 - loss: 0.1896 - regression_loss: 0.1827 - classification_loss: 0.0069 214/500 [===========>..................] - ETA: 2:25 - loss: 0.1893 - regression_loss: 0.1825 - classification_loss: 0.0069 215/500 [===========>..................] - ETA: 2:25 - loss: 0.1899 - regression_loss: 0.1830 - classification_loss: 0.0069 216/500 [===========>..................] - ETA: 2:24 - loss: 0.1899 - regression_loss: 0.1830 - classification_loss: 0.0069 217/500 [============>.................] - ETA: 2:24 - loss: 0.1898 - regression_loss: 0.1829 - classification_loss: 0.0069 218/500 [============>.................] - ETA: 2:23 - loss: 0.1903 - regression_loss: 0.1834 - classification_loss: 0.0069 219/500 [============>.................] - ETA: 2:23 - loss: 0.1900 - regression_loss: 0.1832 - classification_loss: 0.0069 220/500 [============>.................] - ETA: 2:22 - loss: 0.1901 - regression_loss: 0.1833 - classification_loss: 0.0069 221/500 [============>.................] - ETA: 2:22 - loss: 0.1902 - regression_loss: 0.1833 - classification_loss: 0.0069 222/500 [============>.................] 
- ETA: 2:21 - loss: 0.1899 - regression_loss: 0.1831 - classification_loss: 0.0069 223/500 [============>.................] - ETA: 2:21 - loss: 0.1906 - regression_loss: 0.1837 - classification_loss: 0.0069 224/500 [============>.................] - ETA: 2:20 - loss: 0.1903 - regression_loss: 0.1835 - classification_loss: 0.0069 225/500 [============>.................] - ETA: 2:19 - loss: 0.1898 - regression_loss: 0.1830 - classification_loss: 0.0068 226/500 [============>.................] - ETA: 2:19 - loss: 0.1903 - regression_loss: 0.1834 - classification_loss: 0.0069 227/500 [============>.................] - ETA: 2:18 - loss: 0.1901 - regression_loss: 0.1832 - classification_loss: 0.0069 228/500 [============>.................] - ETA: 2:18 - loss: 0.1902 - regression_loss: 0.1834 - classification_loss: 0.0069 229/500 [============>.................] - ETA: 2:17 - loss: 0.1897 - regression_loss: 0.1829 - classification_loss: 0.0069 230/500 [============>.................] - ETA: 2:17 - loss: 0.1902 - regression_loss: 0.1833 - classification_loss: 0.0069 231/500 [============>.................] - ETA: 2:17 - loss: 0.1900 - regression_loss: 0.1831 - classification_loss: 0.0069 232/500 [============>.................] - ETA: 2:16 - loss: 0.1905 - regression_loss: 0.1836 - classification_loss: 0.0069 233/500 [============>.................] - ETA: 2:16 - loss: 0.1903 - regression_loss: 0.1835 - classification_loss: 0.0069 234/500 [=============>................] - ETA: 2:15 - loss: 0.1901 - regression_loss: 0.1833 - classification_loss: 0.0069 235/500 [=============>................] - ETA: 2:15 - loss: 0.1902 - regression_loss: 0.1833 - classification_loss: 0.0069 236/500 [=============>................] - ETA: 2:14 - loss: 0.1901 - regression_loss: 0.1832 - classification_loss: 0.0069 237/500 [=============>................] - ETA: 2:13 - loss: 0.1901 - regression_loss: 0.1832 - classification_loss: 0.0069 238/500 [=============>................] 
- ETA: 2:13 - loss: 0.1897 - regression_loss: 0.1828 - classification_loss: 0.0069 239/500 [=============>................] - ETA: 2:12 - loss: 0.1894 - regression_loss: 0.1825 - classification_loss: 0.0068 240/500 [=============>................] - ETA: 2:12 - loss: 0.1890 - regression_loss: 0.1822 - classification_loss: 0.0068 241/500 [=============>................] - ETA: 2:11 - loss: 0.1893 - regression_loss: 0.1825 - classification_loss: 0.0068 242/500 [=============>................] - ETA: 2:11 - loss: 0.1897 - regression_loss: 0.1828 - classification_loss: 0.0069 243/500 [=============>................] - ETA: 2:10 - loss: 0.1897 - regression_loss: 0.1828 - classification_loss: 0.0069 244/500 [=============>................] - ETA: 2:10 - loss: 0.1897 - regression_loss: 0.1828 - classification_loss: 0.0069 245/500 [=============>................] - ETA: 2:09 - loss: 0.1897 - regression_loss: 0.1828 - classification_loss: 0.0069 246/500 [=============>................] - ETA: 2:09 - loss: 0.1892 - regression_loss: 0.1823 - classification_loss: 0.0068 247/500 [=============>................] - ETA: 2:08 - loss: 0.1894 - regression_loss: 0.1825 - classification_loss: 0.0068 248/500 [=============>................] - ETA: 2:08 - loss: 0.1897 - regression_loss: 0.1827 - classification_loss: 0.0069 249/500 [=============>................] - ETA: 2:07 - loss: 0.1898 - regression_loss: 0.1828 - classification_loss: 0.0069 250/500 [==============>...............] - ETA: 2:07 - loss: 0.1891 - regression_loss: 0.1822 - classification_loss: 0.0069 251/500 [==============>...............] - ETA: 2:06 - loss: 0.1890 - regression_loss: 0.1821 - classification_loss: 0.0069 252/500 [==============>...............] - ETA: 2:06 - loss: 0.1885 - regression_loss: 0.1816 - classification_loss: 0.0069 253/500 [==============>...............] - ETA: 2:05 - loss: 0.1886 - regression_loss: 0.1817 - classification_loss: 0.0069 254/500 [==============>...............] 
- ETA: 2:05 - loss: 0.1884 - regression_loss: 0.1815 - classification_loss: 0.0069 255/500 [==============>...............] - ETA: 2:04 - loss: 0.1892 - regression_loss: 0.1822 - classification_loss: 0.0070 256/500 [==============>...............] - ETA: 2:04 - loss: 0.1897 - regression_loss: 0.1827 - classification_loss: 0.0070 257/500 [==============>...............] - ETA: 2:03 - loss: 0.1896 - regression_loss: 0.1827 - classification_loss: 0.0069 258/500 [==============>...............] - ETA: 2:03 - loss: 0.1895 - regression_loss: 0.1826 - classification_loss: 0.0069 259/500 [==============>...............] - ETA: 2:02 - loss: 0.1893 - regression_loss: 0.1824 - classification_loss: 0.0069 260/500 [==============>...............] - ETA: 2:02 - loss: 0.1894 - regression_loss: 0.1825 - classification_loss: 0.0069 261/500 [==============>...............] - ETA: 2:01 - loss: 0.1891 - regression_loss: 0.1822 - classification_loss: 0.0069 262/500 [==============>...............] - ETA: 2:01 - loss: 0.1893 - regression_loss: 0.1824 - classification_loss: 0.0069 263/500 [==============>...............] - ETA: 2:00 - loss: 0.1892 - regression_loss: 0.1823 - classification_loss: 0.0069 264/500 [==============>...............] - ETA: 2:00 - loss: 0.1888 - regression_loss: 0.1819 - classification_loss: 0.0069 265/500 [==============>...............] - ETA: 1:59 - loss: 0.1888 - regression_loss: 0.1819 - classification_loss: 0.0069 266/500 [==============>...............] - ETA: 1:59 - loss: 0.1887 - regression_loss: 0.1818 - classification_loss: 0.0069 267/500 [===============>..............] - ETA: 1:58 - loss: 0.1885 - regression_loss: 0.1816 - classification_loss: 0.0069 268/500 [===============>..............] - ETA: 1:58 - loss: 0.1882 - regression_loss: 0.1814 - classification_loss: 0.0068 269/500 [===============>..............] - ETA: 1:57 - loss: 0.1892 - regression_loss: 0.1824 - classification_loss: 0.0068 270/500 [===============>..............] 
- ETA: 1:57 - loss: 0.1889 - regression_loss: 0.1821 - classification_loss: 0.0068 271/500 [===============>..............] - ETA: 1:56 - loss: 0.1886 - regression_loss: 0.1818 - classification_loss: 0.0068 272/500 [===============>..............] - ETA: 1:56 - loss: 0.1882 - regression_loss: 0.1814 - classification_loss: 0.0068 273/500 [===============>..............] - ETA: 1:55 - loss: 0.1883 - regression_loss: 0.1815 - classification_loss: 0.0068 274/500 [===============>..............] - ETA: 1:55 - loss: 0.1883 - regression_loss: 0.1815 - classification_loss: 0.0068 275/500 [===============>..............] - ETA: 1:54 - loss: 0.1882 - regression_loss: 0.1814 - classification_loss: 0.0068 276/500 [===============>..............] - ETA: 1:54 - loss: 0.1880 - regression_loss: 0.1813 - classification_loss: 0.0068 277/500 [===============>..............] - ETA: 1:53 - loss: 0.1879 - regression_loss: 0.1812 - classification_loss: 0.0067 278/500 [===============>..............] - ETA: 1:53 - loss: 0.1879 - regression_loss: 0.1812 - classification_loss: 0.0067 279/500 [===============>..............] - ETA: 1:52 - loss: 0.1882 - regression_loss: 0.1815 - classification_loss: 0.0067 280/500 [===============>..............] - ETA: 1:52 - loss: 0.1883 - regression_loss: 0.1816 - classification_loss: 0.0067 281/500 [===============>..............] - ETA: 1:51 - loss: 0.1885 - regression_loss: 0.1818 - classification_loss: 0.0067 282/500 [===============>..............] - ETA: 1:50 - loss: 0.1883 - regression_loss: 0.1816 - classification_loss: 0.0067 283/500 [===============>..............] - ETA: 1:50 - loss: 0.1887 - regression_loss: 0.1820 - classification_loss: 0.0067 284/500 [================>.............] - ETA: 1:49 - loss: 0.1890 - regression_loss: 0.1823 - classification_loss: 0.0067 285/500 [================>.............] - ETA: 1:49 - loss: 0.1891 - regression_loss: 0.1823 - classification_loss: 0.0067 286/500 [================>.............] 
- ETA: 1:48 - loss: 0.1889 - regression_loss: 0.1822 - classification_loss: 0.0067 287/500 [================>.............] - ETA: 1:48 - loss: 0.1891 - regression_loss: 0.1824 - classification_loss: 0.0068 288/500 [================>.............] - ETA: 1:47 - loss: 0.1890 - regression_loss: 0.1823 - classification_loss: 0.0068 289/500 [================>.............] - ETA: 1:47 - loss: 0.1893 - regression_loss: 0.1825 - classification_loss: 0.0068 290/500 [================>.............] - ETA: 1:46 - loss: 0.1888 - regression_loss: 0.1820 - classification_loss: 0.0068 291/500 [================>.............] - ETA: 1:46 - loss: 0.1888 - regression_loss: 0.1820 - classification_loss: 0.0069 292/500 [================>.............] - ETA: 1:45 - loss: 0.1888 - regression_loss: 0.1819 - classification_loss: 0.0068 293/500 [================>.............] - ETA: 1:45 - loss: 0.1887 - regression_loss: 0.1818 - classification_loss: 0.0068 294/500 [================>.............] - ETA: 1:44 - loss: 0.1884 - regression_loss: 0.1816 - classification_loss: 0.0068 295/500 [================>.............] - ETA: 1:44 - loss: 0.1885 - regression_loss: 0.1817 - classification_loss: 0.0068 296/500 [================>.............] - ETA: 1:43 - loss: 0.1887 - regression_loss: 0.1819 - classification_loss: 0.0068 297/500 [================>.............] - ETA: 1:43 - loss: 0.1891 - regression_loss: 0.1822 - classification_loss: 0.0069 298/500 [================>.............] - ETA: 1:42 - loss: 0.1890 - regression_loss: 0.1821 - classification_loss: 0.0068 299/500 [================>.............] - ETA: 1:42 - loss: 0.1889 - regression_loss: 0.1820 - classification_loss: 0.0068 300/500 [=================>............] - ETA: 1:41 - loss: 0.1893 - regression_loss: 0.1824 - classification_loss: 0.0068 301/500 [=================>............] - ETA: 1:41 - loss: 0.1896 - regression_loss: 0.1827 - classification_loss: 0.0069 302/500 [=================>............] 
- ETA: 1:40 - loss: 0.1897 - regression_loss: 0.1828 - classification_loss: 0.0069 303/500 [=================>............] - ETA: 1:40 - loss: 0.1892 - regression_loss: 0.1823 - classification_loss: 0.0069 304/500 [=================>............] - ETA: 1:39 - loss: 0.1897 - regression_loss: 0.1828 - classification_loss: 0.0069 305/500 [=================>............] - ETA: 1:39 - loss: 0.1895 - regression_loss: 0.1826 - classification_loss: 0.0069 306/500 [=================>............] - ETA: 1:38 - loss: 0.1898 - regression_loss: 0.1829 - classification_loss: 0.0069 307/500 [=================>............] - ETA: 1:38 - loss: 0.1900 - regression_loss: 0.1831 - classification_loss: 0.0069 308/500 [=================>............] - ETA: 1:37 - loss: 0.1899 - regression_loss: 0.1830 - classification_loss: 0.0069 309/500 [=================>............] - ETA: 1:37 - loss: 0.1895 - regression_loss: 0.1826 - classification_loss: 0.0069 310/500 [=================>............] - ETA: 1:36 - loss: 0.1893 - regression_loss: 0.1824 - classification_loss: 0.0069 311/500 [=================>............] - ETA: 1:36 - loss: 0.1893 - regression_loss: 0.1824 - classification_loss: 0.0069 312/500 [=================>............] - ETA: 1:35 - loss: 0.1892 - regression_loss: 0.1824 - classification_loss: 0.0069 313/500 [=================>............] - ETA: 1:35 - loss: 0.1889 - regression_loss: 0.1820 - classification_loss: 0.0069 314/500 [=================>............] - ETA: 1:34 - loss: 0.1886 - regression_loss: 0.1818 - classification_loss: 0.0069 315/500 [=================>............] - ETA: 1:34 - loss: 0.1884 - regression_loss: 0.1815 - classification_loss: 0.0069 316/500 [=================>............] - ETA: 1:33 - loss: 0.1884 - regression_loss: 0.1815 - classification_loss: 0.0068 317/500 [==================>...........] - ETA: 1:33 - loss: 0.1881 - regression_loss: 0.1813 - classification_loss: 0.0068 318/500 [==================>...........] 
- ETA: 1:32 - loss: 0.1879 - regression_loss: 0.1810 - classification_loss: 0.0068 319/500 [==================>...........] - ETA: 1:32 - loss: 0.1879 - regression_loss: 0.1811 - classification_loss: 0.0068 320/500 [==================>...........] - ETA: 1:31 - loss: 0.1876 - regression_loss: 0.1808 - classification_loss: 0.0068 321/500 [==================>...........] - ETA: 1:31 - loss: 0.1877 - regression_loss: 0.1809 - classification_loss: 0.0068 322/500 [==================>...........] - ETA: 1:30 - loss: 0.1878 - regression_loss: 0.1810 - classification_loss: 0.0068 323/500 [==================>...........] - ETA: 1:30 - loss: 0.1880 - regression_loss: 0.1812 - classification_loss: 0.0068 324/500 [==================>...........] - ETA: 1:29 - loss: 0.1879 - regression_loss: 0.1811 - classification_loss: 0.0068 325/500 [==================>...........] - ETA: 1:29 - loss: 0.1878 - regression_loss: 0.1810 - classification_loss: 0.0068 326/500 [==================>...........] - ETA: 1:28 - loss: 0.1877 - regression_loss: 0.1809 - classification_loss: 0.0068 327/500 [==================>...........] - ETA: 1:28 - loss: 0.1878 - regression_loss: 0.1810 - classification_loss: 0.0068 328/500 [==================>...........] - ETA: 1:27 - loss: 0.1880 - regression_loss: 0.1812 - classification_loss: 0.0068 329/500 [==================>...........] - ETA: 1:27 - loss: 0.1878 - regression_loss: 0.1809 - classification_loss: 0.0068 330/500 [==================>...........] - ETA: 1:26 - loss: 0.1876 - regression_loss: 0.1808 - classification_loss: 0.0068 331/500 [==================>...........] - ETA: 1:26 - loss: 0.1877 - regression_loss: 0.1808 - classification_loss: 0.0068 332/500 [==================>...........] - ETA: 1:25 - loss: 0.1877 - regression_loss: 0.1809 - classification_loss: 0.0068 333/500 [==================>...........] - ETA: 1:25 - loss: 0.1877 - regression_loss: 0.1809 - classification_loss: 0.0068 334/500 [===================>..........] 
- ETA: 1:24 - loss: 0.1875 - regression_loss: 0.1807 - classification_loss: 0.0068 335/500 [===================>..........] - ETA: 1:24 - loss: 0.1872 - regression_loss: 0.1804 - classification_loss: 0.0068 336/500 [===================>..........] - ETA: 1:23 - loss: 0.1872 - regression_loss: 0.1804 - classification_loss: 0.0068 337/500 [===================>..........] - ETA: 1:22 - loss: 0.1881 - regression_loss: 0.1813 - classification_loss: 0.0068 338/500 [===================>..........] - ETA: 1:22 - loss: 0.1880 - regression_loss: 0.1812 - classification_loss: 0.0068 339/500 [===================>..........] - ETA: 1:21 - loss: 0.1876 - regression_loss: 0.1808 - classification_loss: 0.0067 340/500 [===================>..........] - ETA: 1:21 - loss: 0.1876 - regression_loss: 0.1809 - classification_loss: 0.0067 341/500 [===================>..........] - ETA: 1:20 - loss: 0.1873 - regression_loss: 0.1806 - classification_loss: 0.0067 342/500 [===================>..........] - ETA: 1:20 - loss: 0.1876 - regression_loss: 0.1808 - classification_loss: 0.0068 343/500 [===================>..........] - ETA: 1:19 - loss: 0.1872 - regression_loss: 0.1804 - classification_loss: 0.0068 344/500 [===================>..........] - ETA: 1:19 - loss: 0.1872 - regression_loss: 0.1804 - classification_loss: 0.0068 345/500 [===================>..........] - ETA: 1:18 - loss: 0.1876 - regression_loss: 0.1808 - classification_loss: 0.0068 346/500 [===================>..........] - ETA: 1:18 - loss: 0.1877 - regression_loss: 0.1809 - classification_loss: 0.0068 347/500 [===================>..........] - ETA: 1:17 - loss: 0.1876 - regression_loss: 0.1808 - classification_loss: 0.0068 348/500 [===================>..........] - ETA: 1:17 - loss: 0.1875 - regression_loss: 0.1807 - classification_loss: 0.0068 349/500 [===================>..........] - ETA: 1:16 - loss: 0.1874 - regression_loss: 0.1806 - classification_loss: 0.0068 350/500 [====================>.........] 
- ETA: 1:16 - loss: 0.1875 - regression_loss: 0.1807 - classification_loss: 0.0068 351/500 [====================>.........] - ETA: 1:15 - loss: 0.1872 - regression_loss: 0.1805 - classification_loss: 0.0068 352/500 [====================>.........] - ETA: 1:15 - loss: 0.1869 - regression_loss: 0.1801 - classification_loss: 0.0068 353/500 [====================>.........] - ETA: 1:14 - loss: 0.1868 - regression_loss: 0.1801 - classification_loss: 0.0068 354/500 [====================>.........] - ETA: 1:14 - loss: 0.1870 - regression_loss: 0.1802 - classification_loss: 0.0068 355/500 [====================>.........] - ETA: 1:13 - loss: 0.1871 - regression_loss: 0.1804 - classification_loss: 0.0068 356/500 [====================>.........] - ETA: 1:13 - loss: 0.1871 - regression_loss: 0.1803 - classification_loss: 0.0068 357/500 [====================>.........] - ETA: 1:12 - loss: 0.1870 - regression_loss: 0.1802 - classification_loss: 0.0068 358/500 [====================>.........] - ETA: 1:12 - loss: 0.1867 - regression_loss: 0.1799 - classification_loss: 0.0068 359/500 [====================>.........] - ETA: 1:11 - loss: 0.1865 - regression_loss: 0.1798 - classification_loss: 0.0068 360/500 [====================>.........] - ETA: 1:11 - loss: 0.1865 - regression_loss: 0.1797 - classification_loss: 0.0068 361/500 [====================>.........] - ETA: 1:10 - loss: 0.1865 - regression_loss: 0.1797 - classification_loss: 0.0067 362/500 [====================>.........] - ETA: 1:10 - loss: 0.1862 - regression_loss: 0.1795 - classification_loss: 0.0067 363/500 [====================>.........] - ETA: 1:09 - loss: 0.1868 - regression_loss: 0.1799 - classification_loss: 0.0069 364/500 [====================>.........] - ETA: 1:09 - loss: 0.1868 - regression_loss: 0.1799 - classification_loss: 0.0069 365/500 [====================>.........] - ETA: 1:08 - loss: 0.1870 - regression_loss: 0.1801 - classification_loss: 0.0069 366/500 [====================>.........] 
- ETA: 1:08 - loss: 0.1869 - regression_loss: 0.1799 - classification_loss: 0.0069 367/500 [=====================>........] - ETA: 1:07 - loss: 0.1867 - regression_loss: 0.1798 - classification_loss: 0.0069 368/500 [=====================>........] - ETA: 1:07 - loss: 0.1869 - regression_loss: 0.1800 - classification_loss: 0.0069 369/500 [=====================>........] - ETA: 1:06 - loss: 0.1869 - regression_loss: 0.1799 - classification_loss: 0.0069 370/500 [=====================>........] - ETA: 1:06 - loss: 0.1867 - regression_loss: 0.1798 - classification_loss: 0.0069 371/500 [=====================>........] - ETA: 1:05 - loss: 0.1867 - regression_loss: 0.1798 - classification_loss: 0.0069 372/500 [=====================>........] - ETA: 1:05 - loss: 0.1866 - regression_loss: 0.1797 - classification_loss: 0.0069 373/500 [=====================>........] - ETA: 1:04 - loss: 0.1863 - regression_loss: 0.1794 - classification_loss: 0.0069 374/500 [=====================>........] - ETA: 1:04 - loss: 0.1864 - regression_loss: 0.1795 - classification_loss: 0.0069 375/500 [=====================>........] - ETA: 1:03 - loss: 0.1862 - regression_loss: 0.1793 - classification_loss: 0.0069 376/500 [=====================>........] - ETA: 1:03 - loss: 0.1861 - regression_loss: 0.1792 - classification_loss: 0.0069 377/500 [=====================>........] - ETA: 1:02 - loss: 0.1860 - regression_loss: 0.1791 - classification_loss: 0.0069 378/500 [=====================>........] - ETA: 1:02 - loss: 0.1859 - regression_loss: 0.1790 - classification_loss: 0.0069 379/500 [=====================>........] - ETA: 1:01 - loss: 0.1862 - regression_loss: 0.1793 - classification_loss: 0.0069 380/500 [=====================>........] - ETA: 1:01 - loss: 0.1863 - regression_loss: 0.1795 - classification_loss: 0.0069 381/500 [=====================>........] - ETA: 1:00 - loss: 0.1862 - regression_loss: 0.1793 - classification_loss: 0.0069 382/500 [=====================>........] 
[... per-batch progress output for epoch 43 omitted (batches 383-499; loss held steady around 0.186-0.188, classification_loss ~0.0067-0.0069) ...]
500/500 [==============================] - 255s 509ms/step - loss: 0.1881 - regression_loss: 0.1814 - classification_loss: 0.0067
1172 instances of class plum with average precision: 0.7525
mAP: 0.7525
Epoch 00043: saving model to ./training/snapshots/resnet101_pascal_43.h5
Epoch 44/150
[... per-batch progress output for epoch 44 omitted (batches 1-217 of 500; loss fluctuating in the 0.17-0.22 range, classification_loss ~0.007-0.009, ETA ~4:14 down to ~2:25) ...]
- ETA: 2:24 - loss: 0.1898 - regression_loss: 0.1826 - classification_loss: 0.0072 218/500 [============>.................] - ETA: 2:24 - loss: 0.1898 - regression_loss: 0.1826 - classification_loss: 0.0072 219/500 [============>.................] - ETA: 2:23 - loss: 0.1902 - regression_loss: 0.1830 - classification_loss: 0.0072 220/500 [============>.................] - ETA: 2:23 - loss: 0.1901 - regression_loss: 0.1829 - classification_loss: 0.0072 221/500 [============>.................] - ETA: 2:22 - loss: 0.1912 - regression_loss: 0.1839 - classification_loss: 0.0073 222/500 [============>.................] - ETA: 2:21 - loss: 0.1908 - regression_loss: 0.1836 - classification_loss: 0.0073 223/500 [============>.................] - ETA: 2:21 - loss: 0.1919 - regression_loss: 0.1846 - classification_loss: 0.0073 224/500 [============>.................] - ETA: 2:20 - loss: 0.1914 - regression_loss: 0.1842 - classification_loss: 0.0072 225/500 [============>.................] - ETA: 2:20 - loss: 0.1912 - regression_loss: 0.1839 - classification_loss: 0.0072 226/500 [============>.................] - ETA: 2:19 - loss: 0.1912 - regression_loss: 0.1840 - classification_loss: 0.0072 227/500 [============>.................] - ETA: 2:19 - loss: 0.1912 - regression_loss: 0.1840 - classification_loss: 0.0072 228/500 [============>.................] - ETA: 2:18 - loss: 0.1909 - regression_loss: 0.1837 - classification_loss: 0.0072 229/500 [============>.................] - ETA: 2:18 - loss: 0.1907 - regression_loss: 0.1834 - classification_loss: 0.0073 230/500 [============>.................] - ETA: 2:17 - loss: 0.1910 - regression_loss: 0.1838 - classification_loss: 0.0072 231/500 [============>.................] - ETA: 2:17 - loss: 0.1920 - regression_loss: 0.1848 - classification_loss: 0.0073 232/500 [============>.................] - ETA: 2:16 - loss: 0.1926 - regression_loss: 0.1854 - classification_loss: 0.0073 233/500 [============>.................] 
- ETA: 2:16 - loss: 0.1925 - regression_loss: 0.1852 - classification_loss: 0.0073 234/500 [=============>................] - ETA: 2:15 - loss: 0.1923 - regression_loss: 0.1851 - classification_loss: 0.0073 235/500 [=============>................] - ETA: 2:15 - loss: 0.1918 - regression_loss: 0.1846 - classification_loss: 0.0072 236/500 [=============>................] - ETA: 2:14 - loss: 0.1923 - regression_loss: 0.1850 - classification_loss: 0.0073 237/500 [=============>................] - ETA: 2:14 - loss: 0.1918 - regression_loss: 0.1846 - classification_loss: 0.0072 238/500 [=============>................] - ETA: 2:13 - loss: 0.1920 - regression_loss: 0.1847 - classification_loss: 0.0072 239/500 [=============>................] - ETA: 2:13 - loss: 0.1915 - regression_loss: 0.1843 - classification_loss: 0.0072 240/500 [=============>................] - ETA: 2:12 - loss: 0.1911 - regression_loss: 0.1839 - classification_loss: 0.0072 241/500 [=============>................] - ETA: 2:12 - loss: 0.1909 - regression_loss: 0.1837 - classification_loss: 0.0072 242/500 [=============>................] - ETA: 2:11 - loss: 0.1915 - regression_loss: 0.1842 - classification_loss: 0.0073 243/500 [=============>................] - ETA: 2:11 - loss: 0.1917 - regression_loss: 0.1844 - classification_loss: 0.0073 244/500 [=============>................] - ETA: 2:10 - loss: 0.1921 - regression_loss: 0.1849 - classification_loss: 0.0072 245/500 [=============>................] - ETA: 2:10 - loss: 0.1918 - regression_loss: 0.1846 - classification_loss: 0.0072 246/500 [=============>................] - ETA: 2:09 - loss: 0.1914 - regression_loss: 0.1842 - classification_loss: 0.0072 247/500 [=============>................] - ETA: 2:09 - loss: 0.1912 - regression_loss: 0.1841 - classification_loss: 0.0072 248/500 [=============>................] - ETA: 2:08 - loss: 0.1907 - regression_loss: 0.1835 - classification_loss: 0.0071 249/500 [=============>................] 
- ETA: 2:07 - loss: 0.1908 - regression_loss: 0.1837 - classification_loss: 0.0071 250/500 [==============>...............] - ETA: 2:07 - loss: 0.1915 - regression_loss: 0.1843 - classification_loss: 0.0072 251/500 [==============>...............] - ETA: 2:06 - loss: 0.1912 - regression_loss: 0.1841 - classification_loss: 0.0071 252/500 [==============>...............] - ETA: 2:06 - loss: 0.1911 - regression_loss: 0.1839 - classification_loss: 0.0071 253/500 [==============>...............] - ETA: 2:05 - loss: 0.1909 - regression_loss: 0.1838 - classification_loss: 0.0071 254/500 [==============>...............] - ETA: 2:05 - loss: 0.1910 - regression_loss: 0.1839 - classification_loss: 0.0071 255/500 [==============>...............] - ETA: 2:04 - loss: 0.1908 - regression_loss: 0.1837 - classification_loss: 0.0071 256/500 [==============>...............] - ETA: 2:04 - loss: 0.1910 - regression_loss: 0.1839 - classification_loss: 0.0071 257/500 [==============>...............] - ETA: 2:03 - loss: 0.1910 - regression_loss: 0.1840 - classification_loss: 0.0071 258/500 [==============>...............] - ETA: 2:03 - loss: 0.1915 - regression_loss: 0.1844 - classification_loss: 0.0071 259/500 [==============>...............] - ETA: 2:02 - loss: 0.1916 - regression_loss: 0.1845 - classification_loss: 0.0071 260/500 [==============>...............] - ETA: 2:02 - loss: 0.1912 - regression_loss: 0.1841 - classification_loss: 0.0070 261/500 [==============>...............] - ETA: 2:01 - loss: 0.1916 - regression_loss: 0.1845 - classification_loss: 0.0071 262/500 [==============>...............] - ETA: 2:01 - loss: 0.1919 - regression_loss: 0.1848 - classification_loss: 0.0071 263/500 [==============>...............] - ETA: 2:00 - loss: 0.1917 - regression_loss: 0.1846 - classification_loss: 0.0071 264/500 [==============>...............] - ETA: 2:00 - loss: 0.1917 - regression_loss: 0.1847 - classification_loss: 0.0071 265/500 [==============>...............] 
- ETA: 1:59 - loss: 0.1913 - regression_loss: 0.1842 - classification_loss: 0.0070 266/500 [==============>...............] - ETA: 1:59 - loss: 0.1911 - regression_loss: 0.1841 - classification_loss: 0.0070 267/500 [===============>..............] - ETA: 1:58 - loss: 0.1908 - regression_loss: 0.1838 - classification_loss: 0.0070 268/500 [===============>..............] - ETA: 1:58 - loss: 0.1907 - regression_loss: 0.1837 - classification_loss: 0.0070 269/500 [===============>..............] - ETA: 1:57 - loss: 0.1905 - regression_loss: 0.1836 - classification_loss: 0.0070 270/500 [===============>..............] - ETA: 1:57 - loss: 0.1908 - regression_loss: 0.1839 - classification_loss: 0.0070 271/500 [===============>..............] - ETA: 1:56 - loss: 0.1916 - regression_loss: 0.1846 - classification_loss: 0.0070 272/500 [===============>..............] - ETA: 1:56 - loss: 0.1914 - regression_loss: 0.1844 - classification_loss: 0.0070 273/500 [===============>..............] - ETA: 1:55 - loss: 0.1914 - regression_loss: 0.1844 - classification_loss: 0.0070 274/500 [===============>..............] - ETA: 1:55 - loss: 0.1912 - regression_loss: 0.1842 - classification_loss: 0.0070 275/500 [===============>..............] - ETA: 1:54 - loss: 0.1912 - regression_loss: 0.1842 - classification_loss: 0.0070 276/500 [===============>..............] - ETA: 1:54 - loss: 0.1913 - regression_loss: 0.1843 - classification_loss: 0.0070 277/500 [===============>..............] - ETA: 1:53 - loss: 0.1911 - regression_loss: 0.1842 - classification_loss: 0.0070 278/500 [===============>..............] - ETA: 1:53 - loss: 0.1910 - regression_loss: 0.1841 - classification_loss: 0.0070 279/500 [===============>..............] - ETA: 1:52 - loss: 0.1917 - regression_loss: 0.1845 - classification_loss: 0.0072 280/500 [===============>..............] - ETA: 1:52 - loss: 0.1918 - regression_loss: 0.1846 - classification_loss: 0.0072 281/500 [===============>..............] 
- ETA: 1:51 - loss: 0.1916 - regression_loss: 0.1844 - classification_loss: 0.0072 282/500 [===============>..............] - ETA: 1:51 - loss: 0.1913 - regression_loss: 0.1841 - classification_loss: 0.0072 283/500 [===============>..............] - ETA: 1:50 - loss: 0.1912 - regression_loss: 0.1840 - classification_loss: 0.0072 284/500 [================>.............] - ETA: 1:50 - loss: 0.1908 - regression_loss: 0.1836 - classification_loss: 0.0072 285/500 [================>.............] - ETA: 1:49 - loss: 0.1908 - regression_loss: 0.1836 - classification_loss: 0.0072 286/500 [================>.............] - ETA: 1:49 - loss: 0.1911 - regression_loss: 0.1839 - classification_loss: 0.0072 287/500 [================>.............] - ETA: 1:48 - loss: 0.1912 - regression_loss: 0.1840 - classification_loss: 0.0072 288/500 [================>.............] - ETA: 1:48 - loss: 0.1912 - regression_loss: 0.1840 - classification_loss: 0.0072 289/500 [================>.............] - ETA: 1:47 - loss: 0.1914 - regression_loss: 0.1842 - classification_loss: 0.0072 290/500 [================>.............] - ETA: 1:47 - loss: 0.1911 - regression_loss: 0.1840 - classification_loss: 0.0072 291/500 [================>.............] - ETA: 1:46 - loss: 0.1909 - regression_loss: 0.1837 - classification_loss: 0.0072 292/500 [================>.............] - ETA: 1:46 - loss: 0.1907 - regression_loss: 0.1835 - classification_loss: 0.0071 293/500 [================>.............] - ETA: 1:45 - loss: 0.1905 - regression_loss: 0.1834 - classification_loss: 0.0071 294/500 [================>.............] - ETA: 1:45 - loss: 0.1905 - regression_loss: 0.1833 - classification_loss: 0.0071 295/500 [================>.............] - ETA: 1:44 - loss: 0.1902 - regression_loss: 0.1831 - classification_loss: 0.0071 296/500 [================>.............] - ETA: 1:44 - loss: 0.1903 - regression_loss: 0.1832 - classification_loss: 0.0071 297/500 [================>.............] 
- ETA: 1:43 - loss: 0.1900 - regression_loss: 0.1830 - classification_loss: 0.0071 298/500 [================>.............] - ETA: 1:42 - loss: 0.1903 - regression_loss: 0.1832 - classification_loss: 0.0071 299/500 [================>.............] - ETA: 1:42 - loss: 0.1901 - regression_loss: 0.1831 - classification_loss: 0.0071 300/500 [=================>............] - ETA: 1:41 - loss: 0.1899 - regression_loss: 0.1829 - classification_loss: 0.0070 301/500 [=================>............] - ETA: 1:41 - loss: 0.1900 - regression_loss: 0.1829 - classification_loss: 0.0071 302/500 [=================>............] - ETA: 1:40 - loss: 0.1899 - regression_loss: 0.1828 - classification_loss: 0.0071 303/500 [=================>............] - ETA: 1:40 - loss: 0.1898 - regression_loss: 0.1828 - classification_loss: 0.0071 304/500 [=================>............] - ETA: 1:39 - loss: 0.1899 - regression_loss: 0.1828 - classification_loss: 0.0071 305/500 [=================>............] - ETA: 1:39 - loss: 0.1901 - regression_loss: 0.1830 - classification_loss: 0.0071 306/500 [=================>............] - ETA: 1:38 - loss: 0.1900 - regression_loss: 0.1828 - classification_loss: 0.0071 307/500 [=================>............] - ETA: 1:38 - loss: 0.1896 - regression_loss: 0.1825 - classification_loss: 0.0071 308/500 [=================>............] - ETA: 1:37 - loss: 0.1895 - regression_loss: 0.1824 - classification_loss: 0.0071 309/500 [=================>............] - ETA: 1:37 - loss: 0.1892 - regression_loss: 0.1821 - classification_loss: 0.0071 310/500 [=================>............] - ETA: 1:36 - loss: 0.1889 - regression_loss: 0.1818 - classification_loss: 0.0071 311/500 [=================>............] - ETA: 1:36 - loss: 0.1890 - regression_loss: 0.1820 - classification_loss: 0.0071 312/500 [=================>............] - ETA: 1:35 - loss: 0.1895 - regression_loss: 0.1824 - classification_loss: 0.0071 313/500 [=================>............] 
- ETA: 1:35 - loss: 0.1895 - regression_loss: 0.1824 - classification_loss: 0.0071 314/500 [=================>............] - ETA: 1:34 - loss: 0.1900 - regression_loss: 0.1829 - classification_loss: 0.0071 315/500 [=================>............] - ETA: 1:34 - loss: 0.1899 - regression_loss: 0.1828 - classification_loss: 0.0071 316/500 [=================>............] - ETA: 1:33 - loss: 0.1898 - regression_loss: 0.1827 - classification_loss: 0.0071 317/500 [==================>...........] - ETA: 1:33 - loss: 0.1895 - regression_loss: 0.1824 - classification_loss: 0.0071 318/500 [==================>...........] - ETA: 1:32 - loss: 0.1896 - regression_loss: 0.1826 - classification_loss: 0.0071 319/500 [==================>...........] - ETA: 1:32 - loss: 0.1905 - regression_loss: 0.1834 - classification_loss: 0.0071 320/500 [==================>...........] - ETA: 1:31 - loss: 0.1903 - regression_loss: 0.1832 - classification_loss: 0.0071 321/500 [==================>...........] - ETA: 1:31 - loss: 0.1901 - regression_loss: 0.1830 - classification_loss: 0.0071 322/500 [==================>...........] - ETA: 1:30 - loss: 0.1908 - regression_loss: 0.1837 - classification_loss: 0.0071 323/500 [==================>...........] - ETA: 1:30 - loss: 0.1905 - regression_loss: 0.1834 - classification_loss: 0.0071 324/500 [==================>...........] - ETA: 1:29 - loss: 0.1907 - regression_loss: 0.1836 - classification_loss: 0.0072 325/500 [==================>...........] - ETA: 1:29 - loss: 0.1906 - regression_loss: 0.1835 - classification_loss: 0.0072 326/500 [==================>...........] - ETA: 1:28 - loss: 0.1906 - regression_loss: 0.1835 - classification_loss: 0.0072 327/500 [==================>...........] - ETA: 1:28 - loss: 0.1908 - regression_loss: 0.1836 - classification_loss: 0.0072 328/500 [==================>...........] - ETA: 1:27 - loss: 0.1905 - regression_loss: 0.1834 - classification_loss: 0.0072 329/500 [==================>...........] 
- ETA: 1:27 - loss: 0.1906 - regression_loss: 0.1835 - classification_loss: 0.0071 330/500 [==================>...........] - ETA: 1:26 - loss: 0.1906 - regression_loss: 0.1834 - classification_loss: 0.0072 331/500 [==================>...........] - ETA: 1:26 - loss: 0.1907 - regression_loss: 0.1835 - classification_loss: 0.0072 332/500 [==================>...........] - ETA: 1:25 - loss: 0.1905 - regression_loss: 0.1833 - classification_loss: 0.0071 333/500 [==================>...........] - ETA: 1:25 - loss: 0.1903 - regression_loss: 0.1832 - classification_loss: 0.0071 334/500 [===================>..........] - ETA: 1:24 - loss: 0.1904 - regression_loss: 0.1832 - classification_loss: 0.0071 335/500 [===================>..........] - ETA: 1:24 - loss: 0.1903 - regression_loss: 0.1832 - classification_loss: 0.0071 336/500 [===================>..........] - ETA: 1:23 - loss: 0.1905 - regression_loss: 0.1834 - classification_loss: 0.0071 337/500 [===================>..........] - ETA: 1:22 - loss: 0.1902 - regression_loss: 0.1831 - classification_loss: 0.0071 338/500 [===================>..........] - ETA: 1:22 - loss: 0.1900 - regression_loss: 0.1829 - classification_loss: 0.0071 339/500 [===================>..........] - ETA: 1:21 - loss: 0.1899 - regression_loss: 0.1828 - classification_loss: 0.0071 340/500 [===================>..........] - ETA: 1:21 - loss: 0.1900 - regression_loss: 0.1829 - classification_loss: 0.0071 341/500 [===================>..........] - ETA: 1:20 - loss: 0.1903 - regression_loss: 0.1832 - classification_loss: 0.0071 342/500 [===================>..........] - ETA: 1:20 - loss: 0.1903 - regression_loss: 0.1832 - classification_loss: 0.0071 343/500 [===================>..........] - ETA: 1:19 - loss: 0.1901 - regression_loss: 0.1830 - classification_loss: 0.0071 344/500 [===================>..........] - ETA: 1:19 - loss: 0.1900 - regression_loss: 0.1828 - classification_loss: 0.0071 345/500 [===================>..........] 
- ETA: 1:18 - loss: 0.1897 - regression_loss: 0.1826 - classification_loss: 0.0071 346/500 [===================>..........] - ETA: 1:18 - loss: 0.1899 - regression_loss: 0.1827 - classification_loss: 0.0071 347/500 [===================>..........] - ETA: 1:17 - loss: 0.1899 - regression_loss: 0.1828 - classification_loss: 0.0071 348/500 [===================>..........] - ETA: 1:17 - loss: 0.1899 - regression_loss: 0.1828 - classification_loss: 0.0071 349/500 [===================>..........] - ETA: 1:16 - loss: 0.1903 - regression_loss: 0.1832 - classification_loss: 0.0071 350/500 [====================>.........] - ETA: 1:16 - loss: 0.1904 - regression_loss: 0.1832 - classification_loss: 0.0071 351/500 [====================>.........] - ETA: 1:15 - loss: 0.1904 - regression_loss: 0.1832 - classification_loss: 0.0071 352/500 [====================>.........] - ETA: 1:15 - loss: 0.1903 - regression_loss: 0.1832 - classification_loss: 0.0071 353/500 [====================>.........] - ETA: 1:14 - loss: 0.1905 - regression_loss: 0.1833 - classification_loss: 0.0072 354/500 [====================>.........] - ETA: 1:14 - loss: 0.1906 - regression_loss: 0.1834 - classification_loss: 0.0072 355/500 [====================>.........] - ETA: 1:13 - loss: 0.1905 - regression_loss: 0.1833 - classification_loss: 0.0072 356/500 [====================>.........] - ETA: 1:13 - loss: 0.1905 - regression_loss: 0.1833 - classification_loss: 0.0072 357/500 [====================>.........] - ETA: 1:12 - loss: 0.1903 - regression_loss: 0.1831 - classification_loss: 0.0072 358/500 [====================>.........] - ETA: 1:12 - loss: 0.1903 - regression_loss: 0.1832 - classification_loss: 0.0072 359/500 [====================>.........] - ETA: 1:11 - loss: 0.1903 - regression_loss: 0.1832 - classification_loss: 0.0071 360/500 [====================>.........] - ETA: 1:11 - loss: 0.1904 - regression_loss: 0.1832 - classification_loss: 0.0071 361/500 [====================>.........] 
- ETA: 1:10 - loss: 0.1904 - regression_loss: 0.1832 - classification_loss: 0.0072 362/500 [====================>.........] - ETA: 1:10 - loss: 0.1903 - regression_loss: 0.1832 - classification_loss: 0.0072 363/500 [====================>.........] - ETA: 1:09 - loss: 0.1903 - regression_loss: 0.1831 - classification_loss: 0.0071 364/500 [====================>.........] - ETA: 1:09 - loss: 0.1902 - regression_loss: 0.1831 - classification_loss: 0.0071 365/500 [====================>.........] - ETA: 1:08 - loss: 0.1900 - regression_loss: 0.1828 - classification_loss: 0.0071 366/500 [====================>.........] - ETA: 1:08 - loss: 0.1898 - regression_loss: 0.1827 - classification_loss: 0.0071 367/500 [=====================>........] - ETA: 1:07 - loss: 0.1896 - regression_loss: 0.1825 - classification_loss: 0.0071 368/500 [=====================>........] - ETA: 1:07 - loss: 0.1897 - regression_loss: 0.1826 - classification_loss: 0.0071 369/500 [=====================>........] - ETA: 1:06 - loss: 0.1895 - regression_loss: 0.1824 - classification_loss: 0.0071 370/500 [=====================>........] - ETA: 1:06 - loss: 0.1894 - regression_loss: 0.1823 - classification_loss: 0.0071 371/500 [=====================>........] - ETA: 1:05 - loss: 0.1897 - regression_loss: 0.1825 - classification_loss: 0.0071 372/500 [=====================>........] - ETA: 1:05 - loss: 0.1893 - regression_loss: 0.1822 - classification_loss: 0.0071 373/500 [=====================>........] - ETA: 1:04 - loss: 0.1894 - regression_loss: 0.1823 - classification_loss: 0.0071 374/500 [=====================>........] - ETA: 1:04 - loss: 0.1893 - regression_loss: 0.1822 - classification_loss: 0.0071 375/500 [=====================>........] - ETA: 1:03 - loss: 0.1891 - regression_loss: 0.1820 - classification_loss: 0.0071 376/500 [=====================>........] - ETA: 1:03 - loss: 0.1890 - regression_loss: 0.1819 - classification_loss: 0.0071 377/500 [=====================>........] 
- ETA: 1:02 - loss: 0.1888 - regression_loss: 0.1818 - classification_loss: 0.0070 378/500 [=====================>........] - ETA: 1:02 - loss: 0.1887 - regression_loss: 0.1817 - classification_loss: 0.0070 379/500 [=====================>........] - ETA: 1:01 - loss: 0.1888 - regression_loss: 0.1818 - classification_loss: 0.0070 380/500 [=====================>........] - ETA: 1:01 - loss: 0.1891 - regression_loss: 0.1821 - classification_loss: 0.0070 381/500 [=====================>........] - ETA: 1:00 - loss: 0.1889 - regression_loss: 0.1819 - classification_loss: 0.0070 382/500 [=====================>........] - ETA: 1:00 - loss: 0.1887 - regression_loss: 0.1817 - classification_loss: 0.0070 383/500 [=====================>........] - ETA: 59s - loss: 0.1886 - regression_loss: 0.1816 - classification_loss: 0.0070  384/500 [======================>.......] - ETA: 59s - loss: 0.1887 - regression_loss: 0.1817 - classification_loss: 0.0070 385/500 [======================>.......] - ETA: 58s - loss: 0.1888 - regression_loss: 0.1818 - classification_loss: 0.0070 386/500 [======================>.......] - ETA: 58s - loss: 0.1885 - regression_loss: 0.1815 - classification_loss: 0.0070 387/500 [======================>.......] - ETA: 57s - loss: 0.1885 - regression_loss: 0.1815 - classification_loss: 0.0070 388/500 [======================>.......] - ETA: 56s - loss: 0.1882 - regression_loss: 0.1813 - classification_loss: 0.0070 389/500 [======================>.......] - ETA: 56s - loss: 0.1880 - regression_loss: 0.1810 - classification_loss: 0.0069 390/500 [======================>.......] - ETA: 55s - loss: 0.1877 - regression_loss: 0.1807 - classification_loss: 0.0069 391/500 [======================>.......] - ETA: 55s - loss: 0.1876 - regression_loss: 0.1807 - classification_loss: 0.0069 392/500 [======================>.......] - ETA: 54s - loss: 0.1877 - regression_loss: 0.1808 - classification_loss: 0.0069 393/500 [======================>.......] 
- ETA: 54s - loss: 0.1876 - regression_loss: 0.1807 - classification_loss: 0.0069 394/500 [======================>.......] - ETA: 53s - loss: 0.1878 - regression_loss: 0.1809 - classification_loss: 0.0069 395/500 [======================>.......] - ETA: 53s - loss: 0.1879 - regression_loss: 0.1809 - classification_loss: 0.0069 396/500 [======================>.......] - ETA: 52s - loss: 0.1878 - regression_loss: 0.1809 - classification_loss: 0.0069 397/500 [======================>.......] - ETA: 52s - loss: 0.1876 - regression_loss: 0.1807 - classification_loss: 0.0069 398/500 [======================>.......] - ETA: 51s - loss: 0.1875 - regression_loss: 0.1806 - classification_loss: 0.0069 399/500 [======================>.......] - ETA: 51s - loss: 0.1873 - regression_loss: 0.1805 - classification_loss: 0.0069 400/500 [=======================>......] - ETA: 50s - loss: 0.1872 - regression_loss: 0.1803 - classification_loss: 0.0069 401/500 [=======================>......] - ETA: 50s - loss: 0.1870 - regression_loss: 0.1801 - classification_loss: 0.0069 402/500 [=======================>......] - ETA: 49s - loss: 0.1869 - regression_loss: 0.1800 - classification_loss: 0.0068 403/500 [=======================>......] - ETA: 49s - loss: 0.1866 - regression_loss: 0.1798 - classification_loss: 0.0068 404/500 [=======================>......] - ETA: 48s - loss: 0.1864 - regression_loss: 0.1796 - classification_loss: 0.0068 405/500 [=======================>......] - ETA: 48s - loss: 0.1864 - regression_loss: 0.1796 - classification_loss: 0.0068 406/500 [=======================>......] - ETA: 47s - loss: 0.1862 - regression_loss: 0.1794 - classification_loss: 0.0068 407/500 [=======================>......] - ETA: 47s - loss: 0.1866 - regression_loss: 0.1797 - classification_loss: 0.0068 408/500 [=======================>......] - ETA: 46s - loss: 0.1871 - regression_loss: 0.1802 - classification_loss: 0.0068 409/500 [=======================>......] 
- ETA: 46s - loss: 0.1873 - regression_loss: 0.1804 - classification_loss: 0.0068 410/500 [=======================>......] - ETA: 45s - loss: 0.1873 - regression_loss: 0.1805 - classification_loss: 0.0068 411/500 [=======================>......] - ETA: 45s - loss: 0.1872 - regression_loss: 0.1804 - classification_loss: 0.0068 412/500 [=======================>......] - ETA: 44s - loss: 0.1871 - regression_loss: 0.1803 - classification_loss: 0.0068 413/500 [=======================>......] - ETA: 44s - loss: 0.1871 - regression_loss: 0.1803 - classification_loss: 0.0068 414/500 [=======================>......] - ETA: 43s - loss: 0.1875 - regression_loss: 0.1808 - classification_loss: 0.0068 415/500 [=======================>......] - ETA: 43s - loss: 0.1876 - regression_loss: 0.1808 - classification_loss: 0.0068 416/500 [=======================>......] - ETA: 42s - loss: 0.1877 - regression_loss: 0.1809 - classification_loss: 0.0068 417/500 [========================>.....] - ETA: 42s - loss: 0.1875 - regression_loss: 0.1807 - classification_loss: 0.0068 418/500 [========================>.....] - ETA: 41s - loss: 0.1874 - regression_loss: 0.1806 - classification_loss: 0.0068 419/500 [========================>.....] - ETA: 41s - loss: 0.1872 - regression_loss: 0.1804 - classification_loss: 0.0068 420/500 [========================>.....] - ETA: 40s - loss: 0.1869 - regression_loss: 0.1802 - classification_loss: 0.0068 421/500 [========================>.....] - ETA: 40s - loss: 0.1866 - regression_loss: 0.1799 - classification_loss: 0.0067 422/500 [========================>.....] - ETA: 39s - loss: 0.1866 - regression_loss: 0.1799 - classification_loss: 0.0067 423/500 [========================>.....] - ETA: 39s - loss: 0.1868 - regression_loss: 0.1800 - classification_loss: 0.0067 424/500 [========================>.....] - ETA: 38s - loss: 0.1866 - regression_loss: 0.1799 - classification_loss: 0.0067 425/500 [========================>.....] 
[per-batch progress output for epoch 44, batches 426–499, omitted; running loss ≈ 0.186 throughout]
500/500 [==============================] - 254s 509ms/step - loss: 0.1923 - regression_loss: 0.1853 - classification_loss: 0.0070
1172 instances of class plum with average precision: 0.7370
mAP: 0.7370
Epoch 00044: saving model to ./training/snapshots/resnet101_pascal_44.h5
Epoch 45/150
[per-batch progress output for epoch 45, batches 1–260, omitted; ETA fell from 4:12 to 2:02 as the running loss settled near 0.177 (regression_loss ≈ 0.171, classification_loss ≈ 0.006)]
- ETA: 2:02 - loss: 0.1775 - regression_loss: 0.1712 - classification_loss: 0.0063 261/500 [==============>...............] - ETA: 2:01 - loss: 0.1782 - regression_loss: 0.1718 - classification_loss: 0.0063 262/500 [==============>...............] - ETA: 2:01 - loss: 0.1781 - regression_loss: 0.1718 - classification_loss: 0.0063 263/500 [==============>...............] - ETA: 2:00 - loss: 0.1784 - regression_loss: 0.1720 - classification_loss: 0.0064 264/500 [==============>...............] - ETA: 2:00 - loss: 0.1786 - regression_loss: 0.1723 - classification_loss: 0.0064 265/500 [==============>...............] - ETA: 1:59 - loss: 0.1783 - regression_loss: 0.1720 - classification_loss: 0.0063 266/500 [==============>...............] - ETA: 1:59 - loss: 0.1779 - regression_loss: 0.1716 - classification_loss: 0.0063 267/500 [===============>..............] - ETA: 1:58 - loss: 0.1782 - regression_loss: 0.1719 - classification_loss: 0.0063 268/500 [===============>..............] - ETA: 1:58 - loss: 0.1787 - regression_loss: 0.1724 - classification_loss: 0.0063 269/500 [===============>..............] - ETA: 1:57 - loss: 0.1789 - regression_loss: 0.1726 - classification_loss: 0.0063 270/500 [===============>..............] - ETA: 1:57 - loss: 0.1801 - regression_loss: 0.1738 - classification_loss: 0.0063 271/500 [===============>..............] - ETA: 1:56 - loss: 0.1807 - regression_loss: 0.1744 - classification_loss: 0.0063 272/500 [===============>..............] - ETA: 1:56 - loss: 0.1809 - regression_loss: 0.1746 - classification_loss: 0.0063 273/500 [===============>..............] - ETA: 1:55 - loss: 0.1809 - regression_loss: 0.1745 - classification_loss: 0.0063 274/500 [===============>..............] - ETA: 1:55 - loss: 0.1809 - regression_loss: 0.1745 - classification_loss: 0.0063 275/500 [===============>..............] - ETA: 1:54 - loss: 0.1810 - regression_loss: 0.1746 - classification_loss: 0.0063 276/500 [===============>..............] 
- ETA: 1:54 - loss: 0.1805 - regression_loss: 0.1742 - classification_loss: 0.0063 277/500 [===============>..............] - ETA: 1:53 - loss: 0.1802 - regression_loss: 0.1739 - classification_loss: 0.0063 278/500 [===============>..............] - ETA: 1:53 - loss: 0.1804 - regression_loss: 0.1741 - classification_loss: 0.0063 279/500 [===============>..............] - ETA: 1:52 - loss: 0.1807 - regression_loss: 0.1744 - classification_loss: 0.0063 280/500 [===============>..............] - ETA: 1:52 - loss: 0.1805 - regression_loss: 0.1742 - classification_loss: 0.0063 281/500 [===============>..............] - ETA: 1:51 - loss: 0.1805 - regression_loss: 0.1742 - classification_loss: 0.0063 282/500 [===============>..............] - ETA: 1:51 - loss: 0.1803 - regression_loss: 0.1740 - classification_loss: 0.0063 283/500 [===============>..............] - ETA: 1:50 - loss: 0.1807 - regression_loss: 0.1744 - classification_loss: 0.0063 284/500 [================>.............] - ETA: 1:50 - loss: 0.1808 - regression_loss: 0.1745 - classification_loss: 0.0063 285/500 [================>.............] - ETA: 1:49 - loss: 0.1805 - regression_loss: 0.1742 - classification_loss: 0.0063 286/500 [================>.............] - ETA: 1:49 - loss: 0.1807 - regression_loss: 0.1744 - classification_loss: 0.0063 287/500 [================>.............] - ETA: 1:48 - loss: 0.1803 - regression_loss: 0.1740 - classification_loss: 0.0063 288/500 [================>.............] - ETA: 1:47 - loss: 0.1803 - regression_loss: 0.1740 - classification_loss: 0.0063 289/500 [================>.............] - ETA: 1:47 - loss: 0.1804 - regression_loss: 0.1741 - classification_loss: 0.0063 290/500 [================>.............] - ETA: 1:46 - loss: 0.1802 - regression_loss: 0.1739 - classification_loss: 0.0063 291/500 [================>.............] - ETA: 1:46 - loss: 0.1800 - regression_loss: 0.1737 - classification_loss: 0.0063 292/500 [================>.............] 
- ETA: 1:45 - loss: 0.1799 - regression_loss: 0.1736 - classification_loss: 0.0063 293/500 [================>.............] - ETA: 1:45 - loss: 0.1797 - regression_loss: 0.1735 - classification_loss: 0.0063 294/500 [================>.............] - ETA: 1:44 - loss: 0.1795 - regression_loss: 0.1732 - classification_loss: 0.0063 295/500 [================>.............] - ETA: 1:44 - loss: 0.1793 - regression_loss: 0.1730 - classification_loss: 0.0063 296/500 [================>.............] - ETA: 1:43 - loss: 0.1792 - regression_loss: 0.1730 - classification_loss: 0.0062 297/500 [================>.............] - ETA: 1:43 - loss: 0.1791 - regression_loss: 0.1729 - classification_loss: 0.0063 298/500 [================>.............] - ETA: 1:42 - loss: 0.1790 - regression_loss: 0.1727 - classification_loss: 0.0062 299/500 [================>.............] - ETA: 1:42 - loss: 0.1791 - regression_loss: 0.1729 - classification_loss: 0.0062 300/500 [=================>............] - ETA: 1:41 - loss: 0.1792 - regression_loss: 0.1729 - classification_loss: 0.0063 301/500 [=================>............] - ETA: 1:41 - loss: 0.1796 - regression_loss: 0.1734 - classification_loss: 0.0062 302/500 [=================>............] - ETA: 1:40 - loss: 0.1793 - regression_loss: 0.1731 - classification_loss: 0.0062 303/500 [=================>............] - ETA: 1:40 - loss: 0.1798 - regression_loss: 0.1736 - classification_loss: 0.0062 304/500 [=================>............] - ETA: 1:39 - loss: 0.1794 - regression_loss: 0.1731 - classification_loss: 0.0062 305/500 [=================>............] - ETA: 1:39 - loss: 0.1798 - regression_loss: 0.1736 - classification_loss: 0.0062 306/500 [=================>............] - ETA: 1:38 - loss: 0.1796 - regression_loss: 0.1734 - classification_loss: 0.0062 307/500 [=================>............] - ETA: 1:38 - loss: 0.1806 - regression_loss: 0.1744 - classification_loss: 0.0062 308/500 [=================>............] 
- ETA: 1:37 - loss: 0.1804 - regression_loss: 0.1742 - classification_loss: 0.0062 309/500 [=================>............] - ETA: 1:37 - loss: 0.1801 - regression_loss: 0.1739 - classification_loss: 0.0062 310/500 [=================>............] - ETA: 1:36 - loss: 0.1801 - regression_loss: 0.1739 - classification_loss: 0.0062 311/500 [=================>............] - ETA: 1:36 - loss: 0.1800 - regression_loss: 0.1738 - classification_loss: 0.0062 312/500 [=================>............] - ETA: 1:35 - loss: 0.1796 - regression_loss: 0.1734 - classification_loss: 0.0062 313/500 [=================>............] - ETA: 1:35 - loss: 0.1799 - regression_loss: 0.1737 - classification_loss: 0.0062 314/500 [=================>............] - ETA: 1:34 - loss: 0.1800 - regression_loss: 0.1738 - classification_loss: 0.0062 315/500 [=================>............] - ETA: 1:34 - loss: 0.1797 - regression_loss: 0.1735 - classification_loss: 0.0062 316/500 [=================>............] - ETA: 1:33 - loss: 0.1794 - regression_loss: 0.1732 - classification_loss: 0.0062 317/500 [==================>...........] - ETA: 1:33 - loss: 0.1793 - regression_loss: 0.1731 - classification_loss: 0.0062 318/500 [==================>...........] - ETA: 1:32 - loss: 0.1790 - regression_loss: 0.1728 - classification_loss: 0.0062 319/500 [==================>...........] - ETA: 1:32 - loss: 0.1790 - regression_loss: 0.1729 - classification_loss: 0.0062 320/500 [==================>...........] - ETA: 1:31 - loss: 0.1786 - regression_loss: 0.1725 - classification_loss: 0.0061 321/500 [==================>...........] - ETA: 1:31 - loss: 0.1784 - regression_loss: 0.1722 - classification_loss: 0.0061 322/500 [==================>...........] - ETA: 1:30 - loss: 0.1786 - regression_loss: 0.1725 - classification_loss: 0.0061 323/500 [==================>...........] - ETA: 1:30 - loss: 0.1784 - regression_loss: 0.1723 - classification_loss: 0.0061 324/500 [==================>...........] 
- ETA: 1:29 - loss: 0.1784 - regression_loss: 0.1723 - classification_loss: 0.0061 325/500 [==================>...........] - ETA: 1:29 - loss: 0.1782 - regression_loss: 0.1721 - classification_loss: 0.0061 326/500 [==================>...........] - ETA: 1:28 - loss: 0.1782 - regression_loss: 0.1721 - classification_loss: 0.0061 327/500 [==================>...........] - ETA: 1:27 - loss: 0.1783 - regression_loss: 0.1722 - classification_loss: 0.0061 328/500 [==================>...........] - ETA: 1:27 - loss: 0.1784 - regression_loss: 0.1722 - classification_loss: 0.0061 329/500 [==================>...........] - ETA: 1:26 - loss: 0.1783 - regression_loss: 0.1722 - classification_loss: 0.0061 330/500 [==================>...........] - ETA: 1:26 - loss: 0.1784 - regression_loss: 0.1723 - classification_loss: 0.0061 331/500 [==================>...........] - ETA: 1:25 - loss: 0.1788 - regression_loss: 0.1727 - classification_loss: 0.0061 332/500 [==================>...........] - ETA: 1:25 - loss: 0.1788 - regression_loss: 0.1727 - classification_loss: 0.0061 333/500 [==================>...........] - ETA: 1:24 - loss: 0.1791 - regression_loss: 0.1729 - classification_loss: 0.0061 334/500 [===================>..........] - ETA: 1:24 - loss: 0.1799 - regression_loss: 0.1738 - classification_loss: 0.0061 335/500 [===================>..........] - ETA: 1:23 - loss: 0.1803 - regression_loss: 0.1742 - classification_loss: 0.0061 336/500 [===================>..........] - ETA: 1:23 - loss: 0.1804 - regression_loss: 0.1743 - classification_loss: 0.0061 337/500 [===================>..........] - ETA: 1:22 - loss: 0.1805 - regression_loss: 0.1743 - classification_loss: 0.0061 338/500 [===================>..........] - ETA: 1:22 - loss: 0.1802 - regression_loss: 0.1741 - classification_loss: 0.0061 339/500 [===================>..........] - ETA: 1:21 - loss: 0.1805 - regression_loss: 0.1744 - classification_loss: 0.0061 340/500 [===================>..........] 
- ETA: 1:21 - loss: 0.1806 - regression_loss: 0.1745 - classification_loss: 0.0061 341/500 [===================>..........] - ETA: 1:20 - loss: 0.1804 - regression_loss: 0.1743 - classification_loss: 0.0061 342/500 [===================>..........] - ETA: 1:20 - loss: 0.1805 - regression_loss: 0.1743 - classification_loss: 0.0061 343/500 [===================>..........] - ETA: 1:19 - loss: 0.1806 - regression_loss: 0.1745 - classification_loss: 0.0062 344/500 [===================>..........] - ETA: 1:19 - loss: 0.1806 - regression_loss: 0.1744 - classification_loss: 0.0061 345/500 [===================>..........] - ETA: 1:18 - loss: 0.1803 - regression_loss: 0.1742 - classification_loss: 0.0061 346/500 [===================>..........] - ETA: 1:18 - loss: 0.1804 - regression_loss: 0.1743 - classification_loss: 0.0062 347/500 [===================>..........] - ETA: 1:17 - loss: 0.1804 - regression_loss: 0.1742 - classification_loss: 0.0062 348/500 [===================>..........] - ETA: 1:17 - loss: 0.1803 - regression_loss: 0.1742 - classification_loss: 0.0061 349/500 [===================>..........] - ETA: 1:16 - loss: 0.1800 - regression_loss: 0.1739 - classification_loss: 0.0061 350/500 [====================>.........] - ETA: 1:16 - loss: 0.1797 - regression_loss: 0.1736 - classification_loss: 0.0061 351/500 [====================>.........] - ETA: 1:15 - loss: 0.1797 - regression_loss: 0.1736 - classification_loss: 0.0061 352/500 [====================>.........] - ETA: 1:15 - loss: 0.1796 - regression_loss: 0.1735 - classification_loss: 0.0061 353/500 [====================>.........] - ETA: 1:14 - loss: 0.1795 - regression_loss: 0.1734 - classification_loss: 0.0061 354/500 [====================>.........] - ETA: 1:14 - loss: 0.1795 - regression_loss: 0.1734 - classification_loss: 0.0061 355/500 [====================>.........] - ETA: 1:13 - loss: 0.1792 - regression_loss: 0.1731 - classification_loss: 0.0061 356/500 [====================>.........] 
- ETA: 1:13 - loss: 0.1794 - regression_loss: 0.1733 - classification_loss: 0.0061 357/500 [====================>.........] - ETA: 1:12 - loss: 0.1796 - regression_loss: 0.1735 - classification_loss: 0.0061 358/500 [====================>.........] - ETA: 1:12 - loss: 0.1795 - regression_loss: 0.1734 - classification_loss: 0.0061 359/500 [====================>.........] - ETA: 1:11 - loss: 0.1795 - regression_loss: 0.1734 - classification_loss: 0.0061 360/500 [====================>.........] - ETA: 1:11 - loss: 0.1794 - regression_loss: 0.1733 - classification_loss: 0.0061 361/500 [====================>.........] - ETA: 1:10 - loss: 0.1793 - regression_loss: 0.1732 - classification_loss: 0.0061 362/500 [====================>.........] - ETA: 1:10 - loss: 0.1793 - regression_loss: 0.1732 - classification_loss: 0.0061 363/500 [====================>.........] - ETA: 1:09 - loss: 0.1789 - regression_loss: 0.1728 - classification_loss: 0.0061 364/500 [====================>.........] - ETA: 1:09 - loss: 0.1791 - regression_loss: 0.1731 - classification_loss: 0.0061 365/500 [====================>.........] - ETA: 1:08 - loss: 0.1791 - regression_loss: 0.1729 - classification_loss: 0.0061 366/500 [====================>.........] - ETA: 1:08 - loss: 0.1788 - regression_loss: 0.1727 - classification_loss: 0.0061 367/500 [=====================>........] - ETA: 1:07 - loss: 0.1786 - regression_loss: 0.1725 - classification_loss: 0.0061 368/500 [=====================>........] - ETA: 1:07 - loss: 0.1783 - regression_loss: 0.1723 - classification_loss: 0.0061 369/500 [=====================>........] - ETA: 1:06 - loss: 0.1783 - regression_loss: 0.1722 - classification_loss: 0.0061 370/500 [=====================>........] - ETA: 1:06 - loss: 0.1783 - regression_loss: 0.1722 - classification_loss: 0.0061 371/500 [=====================>........] - ETA: 1:05 - loss: 0.1781 - regression_loss: 0.1721 - classification_loss: 0.0061 372/500 [=====================>........] 
- ETA: 1:05 - loss: 0.1781 - regression_loss: 0.1721 - classification_loss: 0.0061 373/500 [=====================>........] - ETA: 1:04 - loss: 0.1784 - regression_loss: 0.1723 - classification_loss: 0.0061 374/500 [=====================>........] - ETA: 1:04 - loss: 0.1783 - regression_loss: 0.1722 - classification_loss: 0.0061 375/500 [=====================>........] - ETA: 1:03 - loss: 0.1785 - regression_loss: 0.1724 - classification_loss: 0.0061 376/500 [=====================>........] - ETA: 1:03 - loss: 0.1787 - regression_loss: 0.1726 - classification_loss: 0.0061 377/500 [=====================>........] - ETA: 1:02 - loss: 0.1785 - regression_loss: 0.1724 - classification_loss: 0.0061 378/500 [=====================>........] - ETA: 1:02 - loss: 0.1787 - regression_loss: 0.1726 - classification_loss: 0.0061 379/500 [=====================>........] - ETA: 1:01 - loss: 0.1786 - regression_loss: 0.1725 - classification_loss: 0.0061 380/500 [=====================>........] - ETA: 1:01 - loss: 0.1787 - regression_loss: 0.1727 - classification_loss: 0.0061 381/500 [=====================>........] - ETA: 1:00 - loss: 0.1790 - regression_loss: 0.1729 - classification_loss: 0.0061 382/500 [=====================>........] - ETA: 1:00 - loss: 0.1797 - regression_loss: 0.1736 - classification_loss: 0.0061 383/500 [=====================>........] - ETA: 59s - loss: 0.1796 - regression_loss: 0.1735 - classification_loss: 0.0061  384/500 [======================>.......] - ETA: 59s - loss: 0.1795 - regression_loss: 0.1734 - classification_loss: 0.0061 385/500 [======================>.......] - ETA: 58s - loss: 0.1795 - regression_loss: 0.1735 - classification_loss: 0.0061 386/500 [======================>.......] - ETA: 58s - loss: 0.1795 - regression_loss: 0.1735 - classification_loss: 0.0061 387/500 [======================>.......] - ETA: 57s - loss: 0.1796 - regression_loss: 0.1736 - classification_loss: 0.0061 388/500 [======================>.......] 
- ETA: 57s - loss: 0.1799 - regression_loss: 0.1739 - classification_loss: 0.0060 389/500 [======================>.......] - ETA: 56s - loss: 0.1798 - regression_loss: 0.1738 - classification_loss: 0.0060 390/500 [======================>.......] - ETA: 55s - loss: 0.1798 - regression_loss: 0.1737 - classification_loss: 0.0060 391/500 [======================>.......] - ETA: 55s - loss: 0.1798 - regression_loss: 0.1737 - classification_loss: 0.0060 392/500 [======================>.......] - ETA: 54s - loss: 0.1797 - regression_loss: 0.1737 - classification_loss: 0.0060 393/500 [======================>.......] - ETA: 54s - loss: 0.1799 - regression_loss: 0.1738 - classification_loss: 0.0061 394/500 [======================>.......] - ETA: 53s - loss: 0.1802 - regression_loss: 0.1742 - classification_loss: 0.0061 395/500 [======================>.......] - ETA: 53s - loss: 0.1802 - regression_loss: 0.1741 - classification_loss: 0.0061 396/500 [======================>.......] - ETA: 52s - loss: 0.1807 - regression_loss: 0.1745 - classification_loss: 0.0062 397/500 [======================>.......] - ETA: 52s - loss: 0.1808 - regression_loss: 0.1745 - classification_loss: 0.0062 398/500 [======================>.......] - ETA: 51s - loss: 0.1806 - regression_loss: 0.1744 - classification_loss: 0.0062 399/500 [======================>.......] - ETA: 51s - loss: 0.1810 - regression_loss: 0.1748 - classification_loss: 0.0062 400/500 [=======================>......] - ETA: 50s - loss: 0.1808 - regression_loss: 0.1746 - classification_loss: 0.0062 401/500 [=======================>......] - ETA: 50s - loss: 0.1808 - regression_loss: 0.1746 - classification_loss: 0.0062 402/500 [=======================>......] - ETA: 49s - loss: 0.1808 - regression_loss: 0.1746 - classification_loss: 0.0062 403/500 [=======================>......] - ETA: 49s - loss: 0.1812 - regression_loss: 0.1750 - classification_loss: 0.0062 404/500 [=======================>......] 
- ETA: 48s - loss: 0.1810 - regression_loss: 0.1748 - classification_loss: 0.0062 405/500 [=======================>......] - ETA: 48s - loss: 0.1812 - regression_loss: 0.1749 - classification_loss: 0.0062 406/500 [=======================>......] - ETA: 47s - loss: 0.1810 - regression_loss: 0.1747 - classification_loss: 0.0062 407/500 [=======================>......] - ETA: 47s - loss: 0.1809 - regression_loss: 0.1747 - classification_loss: 0.0062 408/500 [=======================>......] - ETA: 46s - loss: 0.1806 - regression_loss: 0.1744 - classification_loss: 0.0062 409/500 [=======================>......] - ETA: 46s - loss: 0.1805 - regression_loss: 0.1743 - classification_loss: 0.0062 410/500 [=======================>......] - ETA: 45s - loss: 0.1806 - regression_loss: 0.1744 - classification_loss: 0.0062 411/500 [=======================>......] - ETA: 45s - loss: 0.1802 - regression_loss: 0.1740 - classification_loss: 0.0062 412/500 [=======================>......] - ETA: 44s - loss: 0.1800 - regression_loss: 0.1739 - classification_loss: 0.0062 413/500 [=======================>......] - ETA: 44s - loss: 0.1803 - regression_loss: 0.1742 - classification_loss: 0.0062 414/500 [=======================>......] - ETA: 43s - loss: 0.1803 - regression_loss: 0.1741 - classification_loss: 0.0062 415/500 [=======================>......] - ETA: 43s - loss: 0.1812 - regression_loss: 0.1750 - classification_loss: 0.0062 416/500 [=======================>......] - ETA: 42s - loss: 0.1811 - regression_loss: 0.1749 - classification_loss: 0.0062 417/500 [========================>.....] - ETA: 42s - loss: 0.1810 - regression_loss: 0.1748 - classification_loss: 0.0062 418/500 [========================>.....] - ETA: 41s - loss: 0.1812 - regression_loss: 0.1750 - classification_loss: 0.0062 419/500 [========================>.....] - ETA: 41s - loss: 0.1812 - regression_loss: 0.1750 - classification_loss: 0.0062 420/500 [========================>.....] 
- ETA: 40s - loss: 0.1812 - regression_loss: 0.1750 - classification_loss: 0.0062 421/500 [========================>.....] - ETA: 40s - loss: 0.1809 - regression_loss: 0.1748 - classification_loss: 0.0062 422/500 [========================>.....] - ETA: 39s - loss: 0.1808 - regression_loss: 0.1746 - classification_loss: 0.0062 423/500 [========================>.....] - ETA: 39s - loss: 0.1808 - regression_loss: 0.1746 - classification_loss: 0.0062 424/500 [========================>.....] - ETA: 38s - loss: 0.1807 - regression_loss: 0.1745 - classification_loss: 0.0061 425/500 [========================>.....] - ETA: 38s - loss: 0.1807 - regression_loss: 0.1745 - classification_loss: 0.0062 426/500 [========================>.....] - ETA: 37s - loss: 0.1805 - regression_loss: 0.1743 - classification_loss: 0.0062 427/500 [========================>.....] - ETA: 37s - loss: 0.1807 - regression_loss: 0.1745 - classification_loss: 0.0062 428/500 [========================>.....] - ETA: 36s - loss: 0.1809 - regression_loss: 0.1747 - classification_loss: 0.0062 429/500 [========================>.....] - ETA: 36s - loss: 0.1807 - regression_loss: 0.1746 - classification_loss: 0.0062 430/500 [========================>.....] - ETA: 35s - loss: 0.1808 - regression_loss: 0.1746 - classification_loss: 0.0062 431/500 [========================>.....] - ETA: 35s - loss: 0.1808 - regression_loss: 0.1746 - classification_loss: 0.0062 432/500 [========================>.....] - ETA: 34s - loss: 0.1808 - regression_loss: 0.1747 - classification_loss: 0.0062 433/500 [========================>.....] - ETA: 34s - loss: 0.1807 - regression_loss: 0.1745 - classification_loss: 0.0062 434/500 [=========================>....] - ETA: 33s - loss: 0.1805 - regression_loss: 0.1743 - classification_loss: 0.0062 435/500 [=========================>....] - ETA: 33s - loss: 0.1803 - regression_loss: 0.1741 - classification_loss: 0.0062 436/500 [=========================>....] 
- ETA: 32s - loss: 0.1807 - regression_loss: 0.1745 - classification_loss: 0.0062 437/500 [=========================>....] - ETA: 32s - loss: 0.1809 - regression_loss: 0.1747 - classification_loss: 0.0062 438/500 [=========================>....] - ETA: 31s - loss: 0.1807 - regression_loss: 0.1745 - classification_loss: 0.0062 439/500 [=========================>....] - ETA: 31s - loss: 0.1805 - regression_loss: 0.1743 - classification_loss: 0.0062 440/500 [=========================>....] - ETA: 30s - loss: 0.1806 - regression_loss: 0.1745 - classification_loss: 0.0062 441/500 [=========================>....] - ETA: 30s - loss: 0.1809 - regression_loss: 0.1747 - classification_loss: 0.0062 442/500 [=========================>....] - ETA: 29s - loss: 0.1807 - regression_loss: 0.1745 - classification_loss: 0.0062 443/500 [=========================>....] - ETA: 29s - loss: 0.1806 - regression_loss: 0.1744 - classification_loss: 0.0062 444/500 [=========================>....] - ETA: 28s - loss: 0.1806 - regression_loss: 0.1744 - classification_loss: 0.0062 445/500 [=========================>....] - ETA: 27s - loss: 0.1805 - regression_loss: 0.1744 - classification_loss: 0.0062 446/500 [=========================>....] - ETA: 27s - loss: 0.1804 - regression_loss: 0.1742 - classification_loss: 0.0062 447/500 [=========================>....] - ETA: 26s - loss: 0.1804 - regression_loss: 0.1743 - classification_loss: 0.0062 448/500 [=========================>....] - ETA: 26s - loss: 0.1803 - regression_loss: 0.1742 - classification_loss: 0.0061 449/500 [=========================>....] - ETA: 25s - loss: 0.1804 - regression_loss: 0.1742 - classification_loss: 0.0061 450/500 [==========================>...] - ETA: 25s - loss: 0.1804 - regression_loss: 0.1742 - classification_loss: 0.0061 451/500 [==========================>...] - ETA: 24s - loss: 0.1802 - regression_loss: 0.1741 - classification_loss: 0.0062 452/500 [==========================>...] 
- ETA: 24s - loss: 0.1801 - regression_loss: 0.1740 - classification_loss: 0.0061 453/500 [==========================>...] - ETA: 23s - loss: 0.1801 - regression_loss: 0.1739 - classification_loss: 0.0061 454/500 [==========================>...] - ETA: 23s - loss: 0.1799 - regression_loss: 0.1738 - classification_loss: 0.0061 455/500 [==========================>...] - ETA: 22s - loss: 0.1798 - regression_loss: 0.1737 - classification_loss: 0.0061 456/500 [==========================>...] - ETA: 22s - loss: 0.1797 - regression_loss: 0.1736 - classification_loss: 0.0061 457/500 [==========================>...] - ETA: 21s - loss: 0.1795 - regression_loss: 0.1734 - classification_loss: 0.0061 458/500 [==========================>...] - ETA: 21s - loss: 0.1792 - regression_loss: 0.1731 - classification_loss: 0.0061 459/500 [==========================>...] - ETA: 20s - loss: 0.1792 - regression_loss: 0.1731 - classification_loss: 0.0061 460/500 [==========================>...] - ETA: 20s - loss: 0.1790 - regression_loss: 0.1729 - classification_loss: 0.0061 461/500 [==========================>...] - ETA: 19s - loss: 0.1789 - regression_loss: 0.1728 - classification_loss: 0.0061 462/500 [==========================>...] - ETA: 19s - loss: 0.1790 - regression_loss: 0.1729 - classification_loss: 0.0061 463/500 [==========================>...] - ETA: 18s - loss: 0.1790 - regression_loss: 0.1729 - classification_loss: 0.0061 464/500 [==========================>...] - ETA: 18s - loss: 0.1793 - regression_loss: 0.1732 - classification_loss: 0.0061 465/500 [==========================>...] - ETA: 17s - loss: 0.1796 - regression_loss: 0.1735 - classification_loss: 0.0061 466/500 [==========================>...] - ETA: 17s - loss: 0.1794 - regression_loss: 0.1733 - classification_loss: 0.0061 467/500 [===========================>..] - ETA: 16s - loss: 0.1796 - regression_loss: 0.1734 - classification_loss: 0.0061 468/500 [===========================>..] 
- ETA: 16s - loss: 0.1799 - regression_loss: 0.1738 - classification_loss: 0.0061 469/500 [===========================>..] - ETA: 15s - loss: 0.1799 - regression_loss: 0.1738 - classification_loss: 0.0061 470/500 [===========================>..] - ETA: 15s - loss: 0.1800 - regression_loss: 0.1739 - classification_loss: 0.0061 471/500 [===========================>..] - ETA: 14s - loss: 0.1805 - regression_loss: 0.1744 - classification_loss: 0.0061 472/500 [===========================>..] - ETA: 14s - loss: 0.1803 - regression_loss: 0.1742 - classification_loss: 0.0061 473/500 [===========================>..] - ETA: 13s - loss: 0.1804 - regression_loss: 0.1743 - classification_loss: 0.0061 474/500 [===========================>..] - ETA: 13s - loss: 0.1804 - regression_loss: 0.1743 - classification_loss: 0.0061 475/500 [===========================>..] - ETA: 12s - loss: 0.1807 - regression_loss: 0.1746 - classification_loss: 0.0061 476/500 [===========================>..] - ETA: 12s - loss: 0.1807 - regression_loss: 0.1746 - classification_loss: 0.0061 477/500 [===========================>..] - ETA: 11s - loss: 0.1808 - regression_loss: 0.1746 - classification_loss: 0.0061 478/500 [===========================>..] - ETA: 11s - loss: 0.1807 - regression_loss: 0.1746 - classification_loss: 0.0061 479/500 [===========================>..] - ETA: 10s - loss: 0.1809 - regression_loss: 0.1747 - classification_loss: 0.0061 480/500 [===========================>..] - ETA: 10s - loss: 0.1808 - regression_loss: 0.1746 - classification_loss: 0.0061 481/500 [===========================>..] - ETA: 9s - loss: 0.1806 - regression_loss: 0.1744 - classification_loss: 0.0061  482/500 [===========================>..] - ETA: 9s - loss: 0.1805 - regression_loss: 0.1744 - classification_loss: 0.0061 483/500 [===========================>..] - ETA: 8s - loss: 0.1805 - regression_loss: 0.1744 - classification_loss: 0.0061 484/500 [============================>.] 
- ETA: 8s - loss: 0.1809 - regression_loss: 0.1748 - classification_loss: 0.0061 485/500 [============================>.] - ETA: 7s - loss: 0.1810 - regression_loss: 0.1748 - classification_loss: 0.0061 486/500 [============================>.] - ETA: 7s - loss: 0.1812 - regression_loss: 0.1750 - classification_loss: 0.0062 487/500 [============================>.] - ETA: 6s - loss: 0.1816 - regression_loss: 0.1754 - classification_loss: 0.0062 488/500 [============================>.] - ETA: 6s - loss: 0.1814 - regression_loss: 0.1753 - classification_loss: 0.0062 489/500 [============================>.] - ETA: 5s - loss: 0.1817 - regression_loss: 0.1755 - classification_loss: 0.0062 490/500 [============================>.] - ETA: 5s - loss: 0.1815 - regression_loss: 0.1753 - classification_loss: 0.0062 491/500 [============================>.] - ETA: 4s - loss: 0.1815 - regression_loss: 0.1754 - classification_loss: 0.0062 492/500 [============================>.] - ETA: 4s - loss: 0.1816 - regression_loss: 0.1755 - classification_loss: 0.0062 493/500 [============================>.] - ETA: 3s - loss: 0.1816 - regression_loss: 0.1755 - classification_loss: 0.0062 494/500 [============================>.] - ETA: 3s - loss: 0.1818 - regression_loss: 0.1756 - classification_loss: 0.0061 495/500 [============================>.] - ETA: 2s - loss: 0.1817 - regression_loss: 0.1756 - classification_loss: 0.0061 496/500 [============================>.] - ETA: 2s - loss: 0.1820 - regression_loss: 0.1759 - classification_loss: 0.0062 497/500 [============================>.] - ETA: 1s - loss: 0.1818 - regression_loss: 0.1757 - classification_loss: 0.0061 498/500 [============================>.] - ETA: 1s - loss: 0.1818 - regression_loss: 0.1756 - classification_loss: 0.0061 499/500 [============================>.] 
- ETA: 0s - loss: 0.1819 - regression_loss: 0.1758 - classification_loss: 0.0061 500/500 [==============================] - 255s 509ms/step - loss: 0.1819 - regression_loss: 0.1758 - classification_loss: 0.0061 1172 instances of class plum with average precision: 0.7549 mAP: 0.7549 Epoch 00045: saving model to ./training/snapshots/resnet101_pascal_45.h5 ALL DONE :-)